Will Expert Systems Solve the Operators' Problems ?

This paper applies the ideas in Ironies of Automation to the design of expert systems.  It is mainly a long list of problems to consider, rather than solutions.  Any advice is about general principles, not detailed specifics.

And this was published 30 years ago.  No doubt there are now many more sophisticated ideas for support systems, which need to be assessed for whether they really are helping.


There are many generalisations here about how people do complex dynamic tasks, and what this implies for the support they need.  For evidence for these generalisations, see the papers listed in the first section of the Home page.


Issues to consider :


1.  Introduction 


2.  The limits to decision support systems / expert systems


2.1. When the structure and function of the process can be fully specified :

2.1.1. Cost benefits.

2.1.2. Displacing the locus of human errors in the system.

2.1.3.  Real-time operation.

2.1.4.  Operators with a 'social' role.


2.2.  When the process cannot be fully specified :

2.2.1. Assessing the computer's performance

2.2.2.  Adding to the workload.

2.2.2.1.  Task Complexity

2.2.2.2.  Increased number of tasks

2.2.3.  Increasing tunnel vision.

2.2.4. Increasing the effect of error

2.2.5.  Lack of opportunities to practice.

2.2.6.  Support system design.


3.  Experience, skill and workload


3.1. Reducing workload by developing and maintaining cognitive skills :

3.1.1. Developing skill

3.1.2. Maintaining skill

3.1.3.  Reducing opportunities for using skill by reducing opportunities for anticipation


3.2.  Increasing workload by increasing unfamiliarity :

3.2.1.  Interface changes.

3.2.2. Changes in responsibility.

3.2.3.  Procedures or intelligence.


4.  Additional problems with the 'replacement' strategy of automation

4.1. Manual Control Skill

4.2. The importance of maintaining interest, and self-value


5.  Aspects of the Design of Collaborative Systems


5.1. Aids which have been suggested as technical possibilities


5.2.  Job-aid suggestions based on studies of operators :

5.2.1. Aids - better information from which the operator can make decisions

5.2.2.  'Distributed  Cognition' - the computer as a member of the operating team


5.3.  Ergonomics criteria, ease of use :

5.3.1.  Displays of multiple goals, and displays to aid anticipation

5.3.2.  Ease of collaboration

5.3.2.1.  Different types of knowledge and reasoning

5.3.2.2.  The computer as a member of the team

5.3.2.3.  Giving help

5.3.2.4.  Organisational structure




Will Expert Systems Solve the Operators' Problems ?


LISANNE BAINBRIDGE


Department of Psychology, University College London

London WC1E 6BT, England


(1990)  In Roe, R.A., Antalovitz, M. and Dienes, E. (eds.), Proceedings of the Workshop on Technological Change Process and its Impact on Work. September 9-13, Siofok, Hungary, pp. 197-218. 

Translation published as : (1991) Les systèmes experts résoudront-ils tous les problèmes des opérateurs ? In Neboit, M. and Fadier, E. (eds.) Proceedings of the Colloquium on Facteurs Humains de la Fiabilité et de la Sécurité des Systèmes Complexes. April, INRS, Nancy, France, pp. 17-26.




1.  Introduction


In complex industrial processes, such as power stations and steel works, classic 'control' automation has taken over much of the feedback control of the plant from the human operator.  When Decision Support Systems or Expert Systems are used to schedule, plan, diagnose faults, or suppress alarms, then some of the thinking and problem solving needed to operate the plant is automated.  This could be called 'cognitive' automation.  In 'control' automation, the irony that automation can make the operators' tasks more, not less, difficult is increasingly recognised (e.g. Wiener & Curry, 1980; Bainbridge, 1983).  The same can be true of cognitive automation, for many of the same reasons.


Whenever some of the operating tasks are automated, this raises questions about the policy for allocating functions between automatics and human operators. Basically there is a choice of two strategies :

- 'replacement', in which automatic devices do everything which can be automated, and the human operator does the remaining tasks.  In particular, this means that the human operator deals with unusual situations which have not been anticipated in the design of the automated system.

- 'collaboration', in which computer and operator work together.  'Intelligent' computers now make it possible for the computer to act as an assistant during periods of high workload, a consultant providing additional expertise, or as an instructor or supervisor. 


This paper will discuss some of the problems which arise in the design of these automated systems, and will focus on three groups of issues :

- if parts of the task are done by computer, the human operators' ability to do these tasks will not develop, and this may cause problems when the computer is not available,

- what help does an operator actually need ?  Much development in this area focusses on what is technically possible, rather than arising out of studies of what might be useful to the operators,

- the ergonomics questions of devising automated devices for ease of use, so that the human operator can understand what they are doing and how to work with them.


The paper will be in the following sections :

- problems with Expert Systems/ Decision Support Systems, whatever the allocation policy,

- the relation between experience, skill and workload,

- problems with the replacement strategy of automation,

- aspects of the design of collaborative systems :

- the technically possible aids,

- the aids which would be useful to the operator,

- cognitive ergonomics questions about team work and advice giving.


2. The Limits to Decision Support Systems/ Expert Systems


The terms Decision Support System and Expert System do not as yet have unique meanings.  Many of the points made in this paper apply to the full range of possibilities for supporting the operator, from a printed check-list of reminders to a computer which can follow chains of reasoning or reproduce the knowledge of experts.  The issues have become more important because of the very much richer possibilities for helping or replacing the operator now that computers which are able to store knowledge and to reason have become available. 


In this section, the points are made under two headings :

- points which apply when the process can be fully specified to the computer,

- points which are important in the more likely situation that the way the process works cannot be described completely.


2.1. When the structure and function of the process can be fully specified.

It is only possible to replace the human operator completely if the process being controlled can be specified completely, so that a complete description of it can be held in the computer (or, see Section 2.1.1, it is sufficiently well understood to be shut down automatically and safely when something goes wrong).


There are many people who do think that it is possible to specify the structure, function and behaviour of a complex process completely, although this view is not usually held by people with production experience.  Consider the situation in which there are multiple unrelated faults.  (For example, in the Three Mile Island incident, many of the emergency systems which the operators tried to use did not work properly.)  To be able to pre-specify what to do in every possible situation, it is necessary to anticipate every situation.  But in a complex process, the number of possible fault combinations is astronomic.  And while it might be possible to work out all the possible fault combinations, it is often not possible to predict what will happen in each of them, when neither they nor anything like them has been experienced before.  It is even more unlikely that it would be possible to predict all the possibilities for common cause failures.  For example, an aircraft crash, fire, earthquake, or explosion would mean that parts of the plant fail together which are geographically, not functionally, related.


However, even if all the possibilities could be anticipated, there would still be at least four areas of concern :

- cost-benefits,

- displacing human error to another part of the system,

- real-time operation,

- the role of the 'socially necessary' operator.

(These are also concerns when the process cannot be completely pre-specified.)


2.1.1. Cost benefits.

If the process is to be fully specified in the computer, then for a complex plant this will be very expensive, both in software development and in process monitoring.  For example, the radiation alarm in a nuclear power station can be set off by water on the radiation transducer.  For the computer to allow for this, it would need data from a water transducer mounted on the radiation transducer (a minimal sketch of this kind of cross-checking is given at the end of this subsection).  So it is necessary to ask at what point the cost-benefit of fully specifying the process to the computer breaks down.  If it is not cost-effective to monitor the process fully, then a human operator will be needed to deal with the situations which are not automated, unless :

- automatic shut-down is possible,

- the number of items not fully specified, and their probability of failure, are low,

- the costs of unnecessary shut-down are not high.

If an operator is needed, then all the points in Sections 2.2 and 3 need to be considered.  Crucially, it may be necessary to rethink the system design so that the operators can fulfil effectively the tasks allocated to them.
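
As a minimal, hypothetical sketch of the cross-checking described above (the sensor names and threshold values are illustrative assumptions, not from the paper), allowing for just this one cause of a spurious alarm already needs an extra transducer and an extra rule :

```python
# Hypothetical sketch : suppressing a spurious radiation alarm caused by water
# on the radiation transducer requires an extra sensor and an extra rule.
# Sensor names and threshold values are illustrative assumptions.

RADIATION_LIMIT = 5.0     # arbitrary units
MOISTURE_LIMIT = 0.8      # a reading above this suggests water on the transducer

def radiation_alarm(radiation_reading, moisture_reading):
    """Raise the radiation alarm only if the high reading is not plausibly
    explained by water on the transducer; otherwise flag it for maintenance."""
    if radiation_reading <= RADIATION_LIMIT:
        return None
    if moisture_reading > MOISTURE_LIMIT:
        return "maintenance : possible water on radiation transducer"
    return "ALARM : high radiation reading"

print(radiation_alarm(7.2, 0.9))   # wet transducer : maintenance flag, no alarm
print(radiation_alarm(7.2, 0.1))   # genuine alarm
```

Every further cause of spurious readings needs its own extra data and rules in the same way, which is why fully specifying a complex plant in the computer is so expensive.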


2.1.2. Displacing the locus of human errors in the system.

One of the reasons for enthusiasm about automation, apart from increased product quantity and quality, is the possibility of reducing the impact of errors by the operators.  But if human operators are replaced by complex software, then reduced on-line errors by the operators will be replaced by increased off-line (and often much more difficult to trace) errors in software.  Reason (?) reviews studies which have found that such 'passive' errors are more frequent than the 'active' errors which occur during operation, even in classic plant designs.  It will be essential for the systems analysis which fully specifies the process to be completely correct, and for the keyboard operators typing in the software to be fully accurate.  Otherwise software reliability problems will be magnified and, until fully reliable error-correcting systems are available, it will be necessary during system development to consider who will have the knowledge and skills to notice, identify and solve the problems caused by these 'designer' errors.


2.1.3.  Real-time operation.

Large expert systems may take some time to reach a conclusion.  This does not matter in static situations, but when there are time constraints, as in thinking out what to do while controlling a process or an aircraft, a computer which cannot come to a conclusion in the time available is not much use.  And amusingly, if the computer uses heuristics rather than algorithms so that it can reach conclusions quickly, then it will make some of the same sorts of errors as human beings.  And the control system will not be fully specified in practice.
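
As a purely illustrative sketch of this trade-off (not a description of any particular expert system; the hypotheses, scores and time budget below are invented for the example), an 'anytime' style of reasoning always answers within the time available, but the answer is only the best of the hypotheses it managed to consider :

```python
# Hypothetical sketch of time-bounded ('anytime') diagnosis : return the best
# hypothesis scored before the deadline, rather than the best possible one.
# Hypotheses, scores and the time budget are illustrative assumptions.

import time

def diagnose(hypotheses, score, time_budget_s):
    """Score hypotheses, ordered most-likely first (a heuristic), until the
    time budget runs out, then return the best found so far."""
    deadline = time.monotonic() + time_budget_s
    best, best_score = None, float("-inf")
    for h in hypotheses:
        if time.monotonic() >= deadline:
            break                       # answer in time, at the cost of coverage
        s = score(h)
        if s > best_score:
            best, best_score = h, s
    return best, best_score

def toy_score(hypothesis):
    time.sleep(0.01)                    # pretend each evaluation is costly
    return {"valve stuck": 0.7, "sensor drift": 0.9, "pump trip": 0.4}[hypothesis]

print(diagnose(["valve stuck", "sensor drift", "pump trip"], toy_score, 0.025))
```

Like a hurried human, such a program can give a plausible but wrong conclusion when the correct hypothesis lies beyond its time budget.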


2.1.4.  Operators with a 'social' role.

Even if a computer for operating the system can be fully specified, it may still be necessary to have an operator in the system for reasons of public confidence.  Passengers accept driverless trains on simple railway tracks, but not pilotless aircraft.  Such 'unnecessary' operators would have a public relations, rather than an operating, role.  But the allocation of responsibility may become ambiguous.  To gain public confidence, the operator must be seen to have real responsibility in the system.  But if this is so, they must be given real responsibility.  And they will not be able to exercise it unless they are given real tasks, or frequent simulator training, by which they can develop and maintain the necessary knowledge and skills (see Section 3).


2.2.  When the process cannot be fully specified.

If the process cannot be fully pre-specified, and the computer system can only deal with the simpler situations, a human operator will be needed.  There will be some of the same problems as with control automation, and some new problems :

- the operator needs to be able to assess the computer system,

- an incomplete support system can add to the operators' workload,

- an incomplete support system which is nearly always correct can encourage tunnel vision,

- a badly designed support system can multiply the effect of human error,

- an incomplete system which does most of the task can remove the operators' opportunities to practice skills and knowledge,

- providing a support system is only one possible way of improving operator performance, and others may have greater cost-benefits.


2.2.1. Assessing the computer's performance

If the computer system is incomplete, then something has to assess whether its performance is acceptable.  This is frequently supposed to be done by the human operator.  The operator might either assess how the computer does the task, or whether the result of the computer's activities is acceptable.  It is frequently inappropriate to ask the human operator to assess how the computer does the task, because the computer does the task in a non-human way.  This may indeed be the reason why this task was allocated to it, for example if the computer :

- uses a different type of reasoning,

- uses complex mathematics,

- works at levels of speed and accuracy not perceivable by human senses.


There are basic ironies to this monitoring.

- Biological sensing systems respond to change.  Human beings are inefficient at monitoring steady-state situations.  It is computers and hardware which are good at this, so there is a non-optimum allocation of function.

- In addition, the computer has been allocated a function because it can do this better than an operator.  How then does the operator have the ability to monitor that the computer is doing the task correctly ?


It is usually more appropriate for the operator to assess the result of the computer's activities, for example to check whether what it suggests should be done is actually appropriate.  We know very little about how good people are at doing this.  The operators' knowledge, skill and workload are likely to be relevant, as mentioned in other sections of this paper.


An additional serious problem is that computer controlled systems mask system failure (Wiener, 1985).  As process variables go out of tolerance after a fault, so the automatics compensate for this, until this is no longer possible, when there is a step-change in the process state.  This step-change is much more difficult to deal with than if the evidence-for-failure trends had been identified earlier.  If operators monitor the process variables directly (a minimal sketch of such trend monitoring follows this list), they can :

- notice that variables are trending towards out of tolerance, and take anticipatory action to prevent this,

- notice when the automated compensatory actions do not have the expected effect, so there must be a fault in the process or the automation.
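
As a minimal sketch of such direct trend monitoring (the variable, limits and look-ahead time are illustrative assumptions, not from the paper), a simple aid could fit a straight line to the recent history of a variable and warn while the automatics are still compensating, rather than waiting for the later step-change :

```python
# Hypothetical sketch of direct trend monitoring of a process variable,
# to surface a failure that compensating automatics would otherwise mask.
# Variable, limits and look-ahead time are illustrative assumptions.

from statistics import mean

def projected_breach(history, low, high, lookahead, dt=1.0):
    """Fit a straight line to recent samples (taken dt seconds apart) and
    report whether the variable is projected to leave [low, high] within
    'lookahead' seconds."""
    n = len(history)
    if n < 2:
        return None
    t = [i * dt for i in range(n)]
    t_bar, y_bar = mean(t), mean(history)
    slope = sum((ti - t_bar) * (yi - y_bar) for ti, yi in zip(t, history)) / \
            sum((ti - t_bar) ** 2 for ti in t)
    projected = history[-1] + slope * lookahead
    if projected > high:
        return f"trending towards high limit (projected {projected:.1f} > {high})"
    if projected < low:
        return f"trending towards low limit (projected {projected:.1f} < {low})"
    return None

# Example : a level still inside its limits, but drifting upwards while the
# automatics compensate.
samples = [50.0, 50.6, 51.3, 52.1, 53.0]          # most recent sample last
warning = projected_breach(samples, low=45.0, high=55.0, lookahead=10.0)
if warning:
    print("Anticipatory alert :", warning)
```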


2.2.2.  Adding to the workload.

One of the aims of automation may be to reduce the workload on the operator, but the actual effect is frequently the opposite.


There are two basic components of workload :

- the objective complexity and number of tasks to be done in a given time,

- the time taken to do each of them.

The time an operator takes to do a task depends on such factors as interface design and the operators' experience, which will be noted in Section 3.  This section illustrates some of the ways in which automation can increase the number and complexity of the tasks the operator has to do.


2.2.2.1.  Task Complexity

Understanding a process which includes a computer component is more difficult than understanding the plant alone.  For example, to work out what is wrong during a fault, and what actions are required and possible to rectify the situation, the operators need to understand the computer decision and control systems, not just the plant.


2.2.2.2.  Increased number of tasks

Here are four of the possibilities :

- if the operators have to check what the computer is doing, this inherently increases their workload.

- if the operators deal with increasing workload by changing their strategy (e.g. Sperandio,  1971) and the computer aid is not compatible with a variety of strategies, using it will add to the workload.

- Alarms on the process : nuisance alarms and loss of information.  It is technically simple to check whether a process variable is within specified limits, and to sound an alarm if it is not.  An alarm usually means that something is wrong.  But there are actually several situations in which it may be appropriate for a variable to be outside the specified limits, for example during start-up or shut-down.  So there may be many 'nuisance' alarms, alarms which the operator has to respond to unnecessarily.  And if many of the alarm signals are irrelevant, this reduces the power of any one alarm signal to attract the operators' attention.  Alarm suppression systems are beginning to be developed, so that limit values are only alarmed when the process is in a state in which they are appropriate (a minimal sketch of this idea follows this list).  But again, as with fully specifying the process, it is difficult to ensure that all the possibilities are covered, and if they are not, an alarm suppression system may reduce the amount of information available to the operator.

A more effective approach to reducing workload in alarm handling may be to use the Gestalt abilities of the human eye to process items in parallel.  If alarm lights are displayed within a plant mimic then operators can see which items are related (e.g. Reiersen et al, 1987).  Even conventional discrete alarms could be interpreted more quickly if grouped according to part of plant, or according to type of information (e.g. danger, out of tolerance, operating status).

- Alarms on the operators' actions.  Again these will be nuisance alarms if the system does not know enough to recognise that a given action is actually appropriate in the particular context.  And this is more likely to be the case in more unusual circumstances, when the operators are more likely to make unusual actions.
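
As a minimal, hypothetical sketch of the alarm-suppression idea mentioned above (the plant states, variable and limit values are illustrative assumptions, not from the paper), alarm limits can be made conditional on the declared plant state, so that values which are normal during start-up or shut-down do not generate nuisance alarms :

```python
# Hypothetical sketch of state-dependent alarm limits ('alarm suppression').
# Plant states, the variable and the limit values are illustrative assumptions.

ALARM_LIMITS = {
    # (variable, plant state) : (low, high)
    ("drum_pressure", "normal_ops"): (140.0, 160.0),
    ("drum_pressure", "start_up"):   (0.0, 160.0),   # low limit irrelevant while raising pressure
    ("drum_pressure", "shut_down"):  (0.0, 160.0),
}

def check_alarm(variable, value, plant_state):
    """Return an alarm message only if the value is outside the limits
    appropriate to the current plant state."""
    limits = ALARM_LIMITS.get((variable, plant_state))
    if limits is None:
        # State not covered by the suppression logic : alarm anyway, rather
        # than silently losing information (see the caution in the text).
        return f"{variable} : no limits defined for state '{plant_state}', raising alarm for review"
    low, high = limits
    if not (low <= value <= high):
        return f"{variable} = {value} outside [{low}, {high}] in state '{plant_state}'"
    return None

print(check_alarm("drum_pressure", 35.0, "start_up"))    # None : suppressed during start-up
print(check_alarm("drum_pressure", 35.0, "normal_ops"))  # alarm : out of tolerance
```

The fall-back branch reflects the caution in the text : if the suppression logic does not cover a situation, it is safer to alarm unnecessarily than to take information away from the operator.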


2.2.3.  Increasing tunnel vision.

Strictly speaking, 'tunnel vision' is a term for the way in which people's attention becomes limited when they are under stress or tired.  People may then notice only the main features of the task, or literally only what is immediately in front of them.  This Section uses the term in a more general sense, to mean any situation in which an operator considers only a narrow range of possibilities.  There are two obvious examples :

-  Human beings do not assess probabilities with great accuracy.  If something is very likely, we assume that it always happens.  If something is very unlikely, we tend to assume that it will never happen.  This means that if a computer support system is very good, and rarely makes mistakes, the operator will stop bothering to assess whether its decision is correct before confirming it.  This has three effects :          

- the 'check' just becomes a nuisance activity of making an unnecessary confirmation action,

- the assumed safety function, provided by the human operator assessing the automatics, has been lost,

- the operator has effectively handed over the task to the computer.  This is in practice equivalent to replacement, with all the attendant problems of loss of skill and motivation, as discussed in Section 3.


The second example illustrates that a decision support system can increase the very thing which it is supposed to prevent.  A decision support system may have been provided to break the operators' tunnel vision, or 'mental set', during problem solving, perhaps by giving a list of reminders which draws the operators' attention to things which they may have ignored.  If in simple cases these suggestions are always right, although the list is actually incomplete for complex situations, then (as above) the operator will come to trust the support, and will lose the skills and knowledge to think beyond what is given in unusual circumstances.  Their attention will be limited to only those things suggested by the support list, and this will give them a false sense of security.


2.2.4. Increasing the effect of error

Automation may multiply the effect of human error, as in 'automation induced failure'  (Wiener and Curry, 1980).  For example, if there is a human error in setting up the operation of an automatic device, which is not designed so that it is easy to check that it is operating correctly, then the automated device can, by following the wrong instructions, magnify the effect of the original small error.


2.2.5.  Lack of opportunities to practice.

Human skills, knowledge and access to knowledge are only maintained by frequent practice.  The more of the operators' task that is replaced by automatic devices, the less practice the operator will get in thinking about the process and using the interface, so they will be less able to do these tasks when they do need to do them.  Their work will be more difficult.  See more in Section 3.


2.2.6.  Support system design.

Computer based support systems are not the only, or always the best, solution for supporting the human operator.  'Error recovery' systems are a good example.  Why spend a great deal of money developing an incomplete (and therefore possibly interfering) system for helping an operator to recover from errors, when a smaller amount of money invested in developing a better interface and training would mean that the operator did not make so many errors in the first place ?  Given the right information and facilities, human beings can be good at recognising and recovering from their errors.  Section 2.2.2.2 gives another example, of interface design to support alarm analysis by the operator.


A computer based system is only one of many solutions to helping the operator and ensuring better operation and fault handling, and should not be used just because it is fashionable and interesting or fun to develop.


3.  Experience, skill and workload

There are three basic ways to do a task, which involve increasing amounts of mental workload (Bainbridge, 1989) :

- using perceptual-motor skill, which requires no conscious attention,

- in familiar tasks, the relevant cognitive skills are fully developed and readily available.  These include (Bainbridge, 1992) :

- the working methods and reference knowledge needed,

- the overviews in working memory, of the current state of the plant, anticipated events, plans of action, etc.

- in unfamiliar tasks, the person has to think out what is an appropriate structure to the tasks, and what to do, i.e. to solve problems.

(Note that the key difference is the familiarity/ unfamiliarity to the individual, not whether this is a normal/ abnormal situation.  If an operator has experienced a fault or emergency situation in real or simulated conditions, then it should be more familiar, and easier to handle.)


Using problem solving is the most mentally demanding way of doing a task.  The amount of problem solving that an operator has to do will depend on :

- the amount of opportunity they have had to develop familiarity/ cognitive skills for this situation,

- anything which increases the unfamiliarity of the situation.


Anything which increases the workload in a situation increases the number of errors which may be made, the number of tasks which may be omitted, and the probability of physical stress effects.


3.1. Reducing workload by developing and maintaining cognitive skills


3.1.1. Developing skill

A person doing a task which is unfamiliar to them will be uncertain about what may happen, and not know much about what to do when something does happen.  An experienced/ skilled person, someone in a familiar situation, knows what to expect, and has already-developed methods for responding.  So the same objective task requirement causes more mental workload for an inexperienced person than for an experienced person.


3.1.2. Maintaining skill

Human memory works in such a way that if knowledge is not used, the memories deteriorate and become more difficult to access.  From the everyday point of view, it is good to have a memory which most readily supplies the currently most used information, but from the point of view of manual take-over from automated systems this is more of a problem.  It means that people not only have to be trained to make situations familiar, but that also, to maintain ready access to this knowledge, it has to be used at frequent intervals.  The operator may be in the system to provide intelligence and problem solving abilities for dealing with situations which the system designers were not able to automate, but active steps need to be taken by the task designers to ensure that the operators' tasks are ones which maintain their cognitive skills (and interest and motivation).


3.1.3.  Reducing opportunities for using skill by reducing opportunities for anticipation

Normally, when an operator is involved in ongoing control of a plant, they think about what to do within the context of knowing :

- the present status of the plant,

- what has happened recently,

- what is expected to happen in the near future.

Much of their activity is done in anticipation.  This has two effects :

- as already mentioned (Section 3.1), people can react more quickly, and stress is lower, when they can anticipate what is going to happen, so fewer alternatives are expected and uncertainty is lower.

- if events are anticipated, then thinking out what to do about them can be done in advance during periods of low workload, rather than after the event.

Both of these minimise the amount of mental work which must be done under time pressure.


In contrast, when people react to an unexpected event, particularly when they have to take over manual operation, they :

- have no expectations about what is going to happen, so have to identify events from scratch,

- do not know the current status of the plant, or the actions available, so have to find these out from scratch,

- have been unable to decide in advance what to do, so have to think this out from scratch,

- if they have to act quickly they will not have full information, and their choice of actions will not take future events into account, so will be less effective.


So they not only have uncertainty because it is an unusual event, they also have much higher workload than they would have had if dealing with the same event which they had been able to anticipate.


These sorts of points have several practical implications, such as that :

- if possible people should do some task which requires them to keep up-to-date with the status of the plant, and what is expected to happen (unless unusual events can develop so quickly that they cannot be anticipated). And this should be a task in which they need to use the information, not just a trivial non-task such as ticking a check list.

- people should have experience with unusual situations, so that the events, and how to handle them, are not unfamiliar.


3.2.  Increasing workload by increasing unfamiliarity

There are many ways in which a take-over or emergency situation can increase unfamiliarity, and so workload and stress.


3.2.1.  Interface changes.

When the operator has to take over, they return to an operating role which is more like their pre-automation role.  It will be more difficult for them to do the task, if the interface which they are used to using does not support this changed role, or if there is a special and unfamiliar interface for use in unusual circumstances (e.g. in an emergency control  room).  People using a well-designed familiar interface know where the information is that they need, how to interpret it, and how to make actions.  If the interface is unfamiliar, or unfamiliar for the given task requirements, they will be uncertain about where to look, and uncertain about what may happen and what to do about it.  So visual search (and problem solving to find information as well as to do the task) will be needed, response times will be longer, and workload will be greater.


It is best if possible to have an interface which can be used for several modes of working.  And these different modes of working should have been practised, so that they are familiar, and give rise to as little unnecessary workload as possible.


3.2.2. Changes in responsibility.

If the computer is not available, that changes the allocation of responsibility for different tasks.  In unusual situations, such as major faults, other personnel enter the control room, who have either higher levels of responsibility in the plant (e.g. the plant manager) or more specialised types of knowledge (e.g. the health physics department).  In such cases, there is a change both in responsibility and in communications.  Again these may lead to uncertainty, and so to increased workload, unless the people have practised being in this enlarged decision-making team, so the situation is familiar.


3.2.3.  Procedures or intelligence.

Typically, in process plant, operators are given procedures, pre-specified instructions about what to do in fault situations.  But, because of the possibility of unusual fault combinations (Section 2.1), they are often also expected to assess whether the procedures are appropriate, and to work out what to do if they are not.  Indeed, Gall (1990) classifies failure to analyse a procedure critically as a human error leading to a nuclear incident.  Apart from the difficulty this assessment of procedures may cause over ambiguity of responsibility, the operators also have the problem that if they usually follow procedures, for which it is not necessary to think, they will not develop or maintain the thinking strategies and knowledge which they need, to be able to work out for themselves what to do (Bainbridge, 1989).


4.  Additional problems with the 'replacement' strategy of automation


At least one study (Roth, Bennett and Woods, 1987)  has found that using a computer didactically to tell the operator what to do leads to less effective system performance than if the operator is actively involved in the problem solving.  Evidently more research should be done on this, as this is contrary to the basic assumption of most of the devices being developed.


The basic irony of both control and cognitive automation, if they use the replacement strategy with manual take-over, is that while the original aim was to reduce reliance on the human operator, in fact the more successful the automation is, the more of the task that is replaced, the more crucial and difficult become the tasks left for the operator.  If the operator makes a more critical contribution, then greater investment in training is needed to develop and maintain the operators' relevant skills.


There are a number of additional points which mean that the replacement approach to allocation of function in complex systems may optimise everyday operation at the expense of operation in unusual circumstances.


4.1. Manual Control Skill

Section 3 concentrates on the need to develop and maintain the operators' cognitive skills.  Although it is frequently claimed that operators do not need manual control skills in automated systems, in fact some major incidents (e.g. Three-Mile Island, Chernobyl) have been exacerbated because the operators did not know how to control part of the process.  Therefore there are some important questions on manual control skill :

- what levels of manual control skill are needed during manual take-over ?  e.g. can a step-change be made by 'inching' to the new position, or do actions need to be made with the correct size and timing, because if not the process will go into oscillation ?

- what is the minimum amount, type, and frequency of refresher training needed to maintain these different levels of control skill ?


4.2. The importance of maintaining interest, and self-value

If people have very little to do, and/or they usually make what is clearly a low quality contribution to the system, then as well as problems with cognitive skill (see Section 3) :

- their thinking and control skills will deteriorate,

- they will be unlikely to remember quickly the relevant information,

- they will be unlikely to be able to build up a wide mental overview of what is happening in the plant,

also :

- the amount of attention which they pay to the task will diminish,

- job involvement will go down : that is, the sense of responsibility will go down, the perceived value of the work will go down, and absenteeism will go up.


The classic problem with the 'replacement and manual take-over' strategy is that the operators spend most of their time with very little to do, with occasional periods with too much to do.  When human beings have too little to do, their skill, motivation, and attention decrease.  A standard way of reducing workload is to increase skill, which means that each task can be done more efficiently.  So there is a common solution to the problems of both everyday under-load and unusual overload, which is to give the operators tasks which maintain their skills and motivation.  For discussions of this need, see Wiener and Curry, 1980; Bainbridge, 1983; Wiener, 1985; Price, 1985.


There are so many difficulties with the replacement strategy of automation, whether control or cognitive, that it is not surprising there is much interest in using the computer in collaboration, or as an expert which can be referred to.  The important aspect is that the required job skills and interest should be maintained inherently through the job design and allocation of function.  This is the 'collaboration' approach, which is discussed in the remaining Sections.


So if the replacement strategy of automation is used, whether in classic or cognitive automation, then training will be needed to maintain skills/ familiarity, and the job will need to be designed to maintain both these skills and the operators' job involvement.


5.  Aspects of the Design of Collaborative Systems


Most of the development which is done in automation is technology driven, and ignores the tasks and problems left for the operator.  This is true even in much of the research in developing what are intended to be computer aids for the operator.  This section will concentrate mainly on studies which have investigated the needs of the human operator.


5.1. Aids which have been suggested as technical possibilities

There are several reviews of what expert systems can do, categorised in terms of the human cognitive tasks which they could replace or support, e.g. Rouse (1981), Zachary (1986), Zimolong et al (1987).  There are also several reviews of ways in which conventional computers and expert systems can be used as aids to the process operator, e.g. papers in Hollnagel et al (1986), Mancini et al (1986).  The following lists some of the possibilities which have been suggested.  Some of these can be done only with advanced computer technology, others can be done technically in several ways : these have not been distinguished.

Technically possible, such as :

- Classic :
    - monitoring,
    - interlocks,
    - control.

- Prediction :
    - with or without monitoring,
    - 'aiding' (i.e. displaying immediately what will be the long-term effect of an action),
    - game playing (to test alternative strategies).

- Check lists of :
    - operating procedures,
    - priorities and constraints,
    - available actions,
    - causes of failure,
    - error recovery methods,
    - dimensions of decision.

- Calculation and decision making :
    - ratio scaling of probabilities and costs,
    - proper weighting of new evidence,
    - multi-attribute decision making,
    - risky decision making.

- Special display formats :
    - stored history,
    - integrated displays, especially of composite measures,
    - keeping track of predicted/ planned alternatives,
    - displaying sub-set of potential data, which is specific to task, skill type, or cognitive style of user,
    - allowing operators to design their own interface.

- Computer monitors operator and :
    - changes display as appropriate for task,
    - takes over if workload high,
    - monitors eye-movements or actions, and informs operator if these are not correct.

- Computer reasoning, whether autonomous or to give advice to the operator :
    - on state interpretation :
        - critical function monitoring,
        - alarm suppression,
        - diagnosis,
    - on actions, how to :
        - meet goal,
        - recover from fault or error,
        - plan,
    - question answering and explanation,
    - assisting operators with different cognitive styles to co-operate.


Typically, most of the items on this list have been suggested because they are (partly) technically possible, not because they are known to be helpful to the operator.  These items are based on insufficient understanding of what operators need help with, and of what causes them difficulty.  When reviewed from the human factors point of view, some of these possibilities have potential for considerably increasing the operators' problems, not reducing them.


For example, if the computer automatically changed display formats, this could considerably disrupt the operators' attention and search processes.  Or if an incomplete computer system monitored the operators for incorrect actions,  this could add to their workload by interrupting them unnecessarily.  There is considerable potential for research in assessing whether the listed possibilities would actually help or hinder the operator.


The following sections therefore summarise the possibilities for computer support, from the human factors perspective.  The first section reviews job aids which have been suggested on the basis of studies of the operator, rather than because they are technically possible, as in the above list.  The second section mentions points which need to be considered for the aids to meet the primary ergonomics criterion of ease of use.


5.2.  Job-aid suggestions based on studies of operators

Suggestions for computer use come in two groups : giving more sophisticated aids to the operators to use in their decision making, or making the computer a member of the team operating the process, adding advice and extra knowledge.


5.2.1. Aids providing better information, from which the operator can make decisions

- reducing information overload (Buttner, 1985),

- making it easier to get a plant overview (Buttner, 1985),

- calculating multi-dimensional measures of the process state, especially inferred measures (de Keyser, 1986; Hoc, 1989),

- helping with prediction of events, or of the effectiveness of proposed actions (Embrey, 1985; de Keyser, 1986; Roth and Woods, 1988; Hoc, 1989; Sanderson, 1989; Amalberti and Deblon, in press) (a minimal sketch of this kind of aid follows this list),

- measures and displays which show how well multiple competing goals are being met (Roth and Woods, 1988), since operators in complex plant frequently have to balance such goals in interactive processes.
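
As a minimal, hypothetical sketch of the 'prediction of the effectiveness of proposed actions' item above (the first-order plant model, tolerances and numbers are illustrative assumptions, not taken from any of the cited studies), an aid could project a variable forward under a proposed action and report whether it would stay in tolerance :

```python
# Hypothetical sketch of a prediction aid : project a process variable forward
# under a proposed action, using a simple first-order model, and report
# whether it stays in tolerance. Model, tolerances and values are assumptions.

def predict(current, target, time_constant_s, horizon_s, step_s=1.0):
    """First-order approach of 'current' towards 'target' (the level the
    proposed action would eventually produce), sampled every step_s seconds."""
    trajectory, value, t = [], current, 0.0
    while t <= horizon_s:
        trajectory.append((t, value))
        value += (target - value) * (step_s / time_constant_s)
        t += step_s
    return trajectory

def assess(trajectory, low, high):
    """Report the first predicted excursion outside [low, high], if any."""
    for t, v in trajectory:
        if not (low <= v <= high):
            return f"predicted out of tolerance at about t = {t:.0f} s (value {v:.1f})"
    return "predicted to stay within tolerance over the horizon"

# Example : the operator proposes an action that would eventually drive the
# temperature to 540; the aid shows the 520 upper limit would be breached first.
trajectory = predict(current=480.0, target=540.0, time_constant_s=60.0, horizon_s=300.0)
print(assess(trajectory, low=450.0, high=520.0))
```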


For really crucial variables, it may be necessary to have both hard-wired and computer generated displays of the same variable, or some other independent source of information from which the user can work out whether it is the variable, the transducer, or the computer which is going wrong.


5.2.2.  'Distributed  Cognition' - the computer as a member of the operating team

- adding extra information and methods of working out what to do, to those available in the human operating team, e.g. :

- suggest further possibilities to consider (Woods, 1986),

- help develop and debug a plan to achieve goals (Woods, 1986; Amalberti and Deblon, in press),

- aid with diagnosis (Norman, 1986), e.g. point out the additional information needed to resolve an ambiguity in interpreting the process state (Embrey, 1985),

- taking over workload in a crisis (Norman, 1986).


5.3.  Ergonomics criteria, ease of use

Using a computer to give extra information, or extra advice, can hinder the operators rather than help them, if the information or advice is difficult to understand and use.  The primary ergonomics criteria in design are to minimise unnecessary effort, and increase 'compatibility'.  Information, interfaces and aids should be presented in a way which is compatible with human thought processes.  But while a great deal is known about how to do this for conventional interfaces and simple tasks, we have very little information on how to do this for computer generated (particularly graphic) displays, or advice-giving devices.


5.3.1.  Displays of multiple goals, and for anticipation

There have been several studies which show that these sorts of displays do improve operator performance, but there is very little advice on the best way to design them.  There are also several studies (e.g. Carroll et al, 1980; Thorndyke and Hayes-Roth, 1982; Gibson and Salvendy, 1990) which show that the best display format to use depends on the type of problem to be solved.  So this is evidently an important area for research.


5.3.2.  Ease of collaboration

What little research has been done in this area has mainly been related to human-computer interaction in such contexts as office automation and programming, rather than in the control of a complex dynamic environment.  However, these pioneering studies do suggest what sort of questions are important.


5.3.2.1.  Different types of knowledge and reasoning

- what are the different types of knowledge ?  (Walker and Stevens, 1986; Bainbridge, 1988)

- format : graphic, verbal, etc.,

- content : goals, functions, physical structure, events over time, etc.,

-  what is the best form of advice ?  e.g. Moll and Fischbacher (1989) found, for numerically controlled tool operators, that the best advice depended on how well they knew the task.  People who had clear goals, and knew the state of the system, needed reminders about what operations were available.  Other people needed a tutorial which helped them to understand the task, by focussing on task goals.

- at what level of detail, and breadth of scope, should advice be given ?  (Woods and Roth, 1988). 

- there are many different strategies for human reasoning and problem solving.  Do these need different types of computer support ?

5.3.2.2.  The computer as a member of the team

- how should knowledge be divided between members of the team ?

- should the computer member of a team reason in a human way, or in a different way?  (Hollnagel, 1987),

- how should communication and responsibility be allocated and organised ?

- if a computer is a member of the team, do the operators think of it, and trust it, in the same way as they do a human member of the team ?  (Muir, 1987)


5.3.2.3.  Giving help

- how do people ask for advice ?  Presumably advice giving by computer should mirror the most effective human methods of advice giving.  Diaper (1986) has found that advice-asking conversations are relatively simple, suggesting that a sophisticated language-handling interface between questioner and advice giver is not necessary.  Falzon (1991) found that the questions people asked were a good clue to their level of expertise, and therefore the type of problems they were likely to be having with understanding, and the sort of advice they needed.

- how do people understand ?  O'Malley (1987) points out that there are several different types of 'help' question, such as : what is this ?  how do I do this ?  what if...?  why did this happen ?  what did I do wrong ?   Replying to each of these could involve a different knowledge base, and a different type of explanation.

Or, the other way round, advice from person to computer as team member :

- if the computer comes to the wrong conclusion, how can the operator give it the information it needs to do better next time ?


5.3.2.4.  Organisational structure

If the operator is expected to intervene when necessary,  any one operator on any one shift may have a flexible and unpredictable mixture of activities, both high and low in complexity and responsibility.  This implies that there should be a flatter organisational structure, rather than a hierarchy.


As most of the studies quoted in this section have raised questions, rather than answering them, there is evidently a great deal of research to be done before computer advice givers can be used without advice.


The first version of this paper was prepared while the author was Visiting Research Fellow in the Ergonomics Workgroup, University of Twente, The Netherlands.  The author would like to thank Dr Ted White and his colleagues for providing an excellent research environment.




References database


[And sometimes - the computer overrides all attempts by the user to get it to do something different.  As in some of the word processing done by Blogger - I have tried several possible strategies for getting this left-aligned, without success !]




Access to other papers via the Home page


©  1998, 2022 Lisanne Bainbridge



