Difficulties and Errors in Complex Dynamic Tasks

This is a discussion paper, written 1992, revised 1993, 1998.

It is another example of a late paper. The topic is one I felt was important, but this is just a collation of my notes.


Sections 2.1-2.3 reiterate the usual points about the nature of complex behaviour.

Sections 2.4-4 include points about difficulties and error sources.

As usual, these are long lists of the difficulties and complexities involved in dealing with this topic.


The reference list is incomplete, and I no longer have access to the relevant material.


Topics :

1. Introduction.


2. Contextual cognitive processes.

2.1. Basic cognitive processes.

2.2. Organising the sequence of behaviour.

2.3. Dealing with unfamiliar situations.

2.4. Workload, team work.


3. Difficulties and errors related to cognitive processes.

3.1. The overview and behaviour organisation.

3.2. Interpreting the evidence and understanding the situation.


4. Brief notes on the implications for reducing error rates.

References




Difficulties and Errors in Complex Dynamic Tasks

unpublished discussion paper


Lisanne Bainbridge

Department of Psychology, University College London

1992




1. Introduction


Operators in complex dynamic tasks need to be adaptable to changing circumstances, and their understanding, planning, and organisation of behaviour are crucial to their effectiveness. It follows that an error categorisation scheme for these tasks needs to include difficulties and errors in the organisation of behaviour as a prime focus. 

As the basis for developing such a scheme, this paper outlines a contextual model of cognitive processing. This describes the way a temporary inference structure, which represents the current task situation, is built up in working storage and then provides the context for later task processing and for the organisation of behaviour.

The account of the model is followed by a discussion of workload effects. These concepts provide the framework and justification for the main categories of difficulty and error which are suggested. The paper ends with some notes on reducing error rates.


This paper follows Leplat (1990) and Reason (e.g. 1990) in thinking of an error as behaviour which does not meet task requirements, and which is the outcome of usual cognitive processes which are ineffective in a particular situation, rather than as the result of special error mechanisms. In complex dynamic tasks the notion of error needs to be wider than it is in tasks in which there is a simple direct relation between situation and response. A person's behaviour in complex dynamic tasks can be described and assessed at several levels of detail, and may not meet the task requirements at any one or more of these levels. For example, the behaviour might meet the main task requirement (e.g. avoid nuclear radiation release) while being ineffectual or non-optimal in detail (Norros and Sammatti 1986). For this reason, while there are clear cognitive errors, in order for human factors/ergonomics (HF/E) to help people doing these tasks it is necessary also to consider such wider notions as difficulty, failing, or weakness in cognitive processing, and these words will be used loosely to indicate operator inefficiency.


The terms 'task' and 'organisation of behaviour' also need explanation. 'Task' is a term with no standardised meaning, and there does not seem to be an alternative term which avoids confusion. In this paper 'task' refers to the whole configuration of three aspects. 

The first is the task and sub-task goals, the specification of what needs to be achieved, as described for example by Hierarchical Task Analysis (e.g. Shepherd 1985). At the highest level an operator's work may be a 'job', in the sense of an arbitrary collection of tasks. 

The second aspect is that in order to meet any of these task goals the operator may have cognitive goals (see 2.1.1). 

The third task aspect is the resources used by the operator in meeting the task and cognitive demands, such as external environment, equipment, coworkers, and inner expertise. 

So the term 'task' is used here loosely, unless it is useful to distinguish between these three aspects.


This paper focuses on complex dynamic tasks such as industrial process operation, naval mine sweeping, flying, air-traffic control, car driving, emergency services management such as fire or battle fighting, and various medical professions. (In this paper, most of the examples come from process operation.) Such tasks are cognitively demanding. A person doing this kind of task has to coordinate a larger number of different types of cognitive processing than are used in other tasks. This coordination is needed because these tasks have several key features, which are summarised in Table 1.


Table 1 : Key Features of Complex Dynamic Tasks, and their implications for the cognitive processes required.


Firstly, information may not be available about the state of some parts of the system. Information which is available about the state of an entity, and the effects of actions on it, is often ambiguous. These limitations have two important effects. The person needs to build up a structure of inference to describe the current, and anticipated future, situation as they understand it. And in doing this they have to make decisions under uncertainty, so bias can enter.


Secondly, the person is expected to keep under control one or more independent dynamic entities, which will continue to behave in their own way over time even if no action is made on them. Also these entities may have several variables to be controlled. So the operator has several simultaneous task responsibilities, each of which may involve a hierarchy of sub-tasks. Because of evolution of the entity's behaviour over time, actions need to have the right size and timing to have the required effect. Also there may not be time to think out the choice between alternative actions before making a response. And because of the combination of evolution over time with multitasking, it may not be possible to complete one part of the task before starting on another, so tasks have to be interleaved, which puts an emphasis on the organisation of behaviour.


So the term 'organisation of behaviour' also has a configuration of meanings. 

One aspect is the organisation of the hierarchy of task and cognitive goals, and working methods, by which a task is done. 

A second aspect of the organisation of behaviour is the general allocation of resources between different goals. 

A third aspect of behaviour organisation is the process of deciding which specific sub-task from which main goal should be done next, at any particular moment (see 2.2). At a lower level of detail, this includes the allocation of attention (see 2.1.1 and Figure 2). If time has been available for planning, then decisions about the sequence in which to do tasks may have been made in advance. However when it is implemented, such a plan needs to be adapted to the details of the current context.


Also, most dynamic tasks are sufficiently complex for it not to be possible (at least in practice) to anticipate beforehand all the possible situations which might arise, and to pre-specify how to deal with them. So people doing this type of task need to be flexible, to adapt their behaviour to changing details of the situation, and to work out for themselves how to deal with unfamiliar situations.


Finally, these groups of features are interdependent, in that inferences, task organisation, and adaptability are all handled most effectively by building up and referring to an overview of the total task situation. In building up this overview, people decide what information to attend to, rather than just passively responding to stimulus changes.


The above features are discussed more fully in Section 2. These features have led to a change in models of how these tasks are done, from sequential to contextual models which emphasise this overview and the effective organisation of complex behaviour (see Section 2). Contextual models focus on the temporary inference structure built up in working storage to describe the task situation, and how this provides the context for later processing and the organisation of behaviour.

This focus on the organisation of behaviour leads to an inversion of the usual approach to human error, which starts with peripheral aspects and goes on to errors in more central processing. The error framework proposed here starts from weaknesses in the organisation of behaviour. Attention and perceptual decisions are considered, not as a first stage before more complex cognitive processes are involved, but as being done within the context of the current task understanding and planning. Understanding and planning lead to errors in attention and perceptual decisions, by affecting the biases involved (see 2.1.2).


All the features of complex dynamic tasks can lead to difficulties and errors. The aim of this paper is to provide a framework, focusing on the understanding and organisation of behaviour which are essential features of these tasks. It will not discuss the equipment related perceptual-motor errors which are listed in many other error analyses, although equipment design can have strong effects on the types of error which will be discussed. Also the main interest will be on the ways in which cognitive processes can fail, rather than on the external causes of these failures. In any real incident these external causes are many and diverse (see e.g. de Keyser and Nyssen 1993). This paper does not claim to give a complete error scheme. It gives a preliminary discussion, rather than a complete taxonomy, and it does not offer a method for identifying which category any particular difficulty or error is a member of, nor a method for predicting human error rates. The aim is to focus attention on human error types and processing difficulties which are not much discussed, as a stimulus to a more complete account.

The discussion in this paper will be in three main sections, on :

- a simplified model of the processes involved in doing complex dynamic tasks,

- cognitive errors and difficulties in these tasks, in categories related to the cognitive processes involved,

- brief comments on what this approach to modelling, and error categorisation, imply for reducing error rates.


2. Contextual cognitive processes


The most widely discussed processes underlying cognitive error are 'slips' and 'mistakes'. A person makes a 'mistake' if they do not choose the correct objective or how to meet it, and a 'slip' if they intend the correct action, but execution fails. These concepts were developed by Norman (e.g. 1981) to account for human error in habitual everyday activities. The types of difficulty with behaviour organisation which people have in complex dynamic tasks might be categorised as mistakes, but this would make the 'mistake' category an unwieldy one containing many different types of processing.


To get evidence about the nature of difficulties in task and behaviour organisation, it is necessary to study complex dynamic tasks themselves. Studies of serious events and accidents in these tasks are incomplete as a source of evidence, partly because the cognitive processes in these incidents are usually not studied (with some notable exceptions, e.g. Pew, Miller and Feeher 1981), partly because difficulties can have effects on a less dramatic scale. The data from incident studies need to be supplemented by evidence from studies which focus on cognitive processes in dynamic tasks. This section presents a simple model of the processes which such cognitive studies have shown are needed to account for cognitive behaviour in complex tasks. It will be in four subsections, on :


- the basic cognitive processes of building up temporary inference structures to describe the present situation and determine the best sequence of activity, 

- the sequencing of activity,

- how people deal with unfamiliar situations, 

- general factors, in particular workload and team work, which affect error rates. 

These topics provide the framework for the categories of error and difficulty used in Section 3.


2.1. Basic cognitive processes

Before discussing the key cognitive processes in complex dynamic tasks, it is useful to outline how they differ from other tasks which do not require the same processes.

For example, a medical receptionist responds to various patient and doctor requests. As several requests may occur at the same time, keeping track of the multitasking is a major feature of effective performance. However, it may not be possible to predict the arrival of task demands, nor to make inferences based on a mental model of the situation. Each task demand may be dealt with by a simple routine, and the tasks are not interdependent. So it may not be possible to build up a long-term overview of what is happening, and doing so may well not make the person more effective at the task.


By contrast, when doing a complex dynamic task, people do not only react directly to external stimuli. They build up, in working storage, a temporary structure of inference which represents their understanding of the present and future situation, and their plans for what to do about it. These inferences are structured by the person's cognitive goals (see 2.1.1), and are built up from : 

- information in the environment, processed for its relevance to the task; 

- knowledge from the knowledge bases (see 2.1.3); 

- items already in working storage as a result of previous thinking. 

For a review of the evidence for these processes, and support for this type of model, see Bainbridge 1992. (The term working 'storage' is used here, rather than working memory, to avoid the implication that this is limited to the mechanism of the articulatory loop method of remembering. The word 'inference' may also be somewhat misleading, as there are many processes by which this temporary inference structure is built up.)


Figure 1. Cognitive processes and reference knowledge in complex dynamic tasks.


Figures 1 and 2 represent the basic model of processing used here. Figure 1 shows the two main components of cognitive activity: the structures in working storage, and the reference knowledge bases used in building up these structures. (For the present discussion it is not important whether or not there is parallel processing.) 

As the external world is constantly changing, so the temporary representation needs to be updated, by the cycle represented in Figure 2. 

These two Figures are a 'model' in the primitive sense that they reduce the features of complex reality to those which are central to the present discussion, and also in the sense that, for some people, a representation in a Figure is a 'model', while the same points made in prose are not.


Figure 2. The cycle of processing.
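
To make this cycle concrete, the following sketch (in Python; not part of the original paper, and all structures and names are invented stand-ins) shows the temporary overview being updated from the environment through reference knowledge, and its contents then selecting the next activity. The point is that the choice of activity is a function of the overview, not of a fixed stage sequence.

# Minimal illustrative sketch of the processing cycle. The 'overview' stands
# in for the temporary inference structure in working storage; 'knowledge'
# stands in for the reference knowledge bases. All names are invented.

def one_cycle(overview, environment, knowledge):
    """One pass of the cycle : update the overview, then choose what to do."""
    # 1. Update the overview from sensed information, interpreted via knowledge.
    for variable, reading in environment.items():
        overview[variable] = knowledge["interpret"](variable, reading)
    # 2. The overview provides the context which selects the next activity.
    if overview.get("process state") == "unacceptable":
        overview["next activity"] = "act"
    else:
        overview["next activity"] = "think ahead"
    return overview

knowledge = {"interpret": lambda variable, reading: reading}
overview = one_cycle({}, {"process state": "unacceptable"}, knowledge)
print(overview["next activity"])  # 'act'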


In a complex dynamic task, it is usually necessary to make inferences, to interpret the situation, rather than to assume that the situation is only and exactly what can currently be sensed in the environment, because the information directly available is usually inadequate for understanding, or choosing an effective action. For example, it is not possible to see a leak in the primary circuit of a pressurised water nuclear power plant, because the circuit is enclosed by a thick concrete wall. Instead, a leak is inferred when there is a mismatch between the displayed flows into and out of part of the circuit.
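
As a toy version of this inference (a sketch, not from the paper; the variable names and the tolerance for measurement noise are assumptions), the leak decision can be written as a comparison of displayed flows :

# Infer an unobservable leak from a mismatch between displayed inflow and
# outflow, allowing for measurement noise. The tolerance value is invented.

def leak_inferred(inflow_kg_s, outflow_kg_s, tolerance_kg_s=0.5):
    """Return True if the flow balance suggests a leak in the enclosed circuit."""
    mismatch = inflow_kg_s - outflow_kg_s
    # More goes in than comes out, beyond what measurement noise can explain.
    return mismatch > tolerance_kg_s

print(leak_inferred(12.0, 11.0))  # True : evidence of a leak
print(leak_inferred(12.0, 11.8))  # False : within measurement tolerance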


In these tasks also, the task variance is such that it is often not possible to make an automated or standardised response to a situation. Planning may be necessary for at least two reasons. 

The optimum action may be one which takes into account expected future events, perhaps because they will give a natural solution to a present problem (for example, a change shortly due on a furnace will correct the present overuse of electric power by it), or in order to prevent an undesirable situation from developing. 

Secondly, in complex dynamic tasks there are usually several simultaneous goals to be met, and choosing a sequence of activity which is satisfactory by as many of these criteria as possible needs some predicting and comparing.


These processes of understanding and planning, of building up the temporary inference structure, make use of three types of mechanism, which will be discussed in sections on : 

- cognitive goals, 

- predispositions, 

- reference to stored knowledge bases.


2.1.1. Cognitive goals and information acquisition.

In complex dynamic tasks, people typically break down task goals into cognitive goals. It is not always necessary to describe responses to task demands as involving cognitive goals: for example if the response is automated (perceptual-motor skill) or proceduralised, or if a person rejects a goal they have been given. However, the process of building up a temporary inference structure in working storage, which represents the current situation in relation to the task goals, is mainly structured by cognitive working methods related to cognitive goals. Cognitive goals are an intermediary to meeting task goals. For example, the task goal : 'keep temperature at 300 degrees', involves the cognitive goals : 'find present temperature', 'evaluate present temperature against required temperature', 'choose corrective action' (these may not be consciously explicit or distinct to the person doing the task). Cognitive goals are not just trivial intermediaries, but are also a major aspect of how the person doing a complex task structures what they are doing (see section 2.2, and Bainbridge 1992, Sundstrom and Salvador 1991, Duncan 1990). In industrial process operation, typically the main cognitive goals are to : infer the present process state, review future events and states, review (future) product demands and plant constraints, evaluate states against demands, review actions available and their effect, choose appropriate actions, prepare an activity plan (Bainbridge 1992).
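
To make the decomposition concrete, here is a minimal sketch (not from the paper; the sensor value, tolerance and action names are invented) of the temperature task goal broken into the three cognitive goals named above :

# The task goal 'keep temperature at 300 degrees' decomposed into cognitive
# goals. Numbers and action names are invented for illustration.

TARGET_C = 300.0
TOLERANCE_C = 5.0

def find_present_temperature(sensor_reading_c):
    """Cognitive goal 1 : find the present temperature."""
    return sensor_reading_c

def evaluate_against_required(present_c):
    """Cognitive goal 2 : evaluate the present value against the target."""
    return present_c - TARGET_C

def choose_corrective_action(error_c):
    """Cognitive goal 3 : choose a corrective action."""
    if abs(error_c) <= TOLERANCE_C:
        return "no action"
    return "reduce heating" if error_c > 0 else "increase heating"

error = evaluate_against_required(find_present_temperature(312.0))
print(choose_corrective_action(error))  # 'reduce heating'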


Cognitive goals are also involved in the way information is obtained. People acquire information from the environment in two ways, actively and passively. In 'active' information acquisition, people search for the information they need for what they are currently thinking about, to meet their cognitive goals. As part of this, people keep up to date with changes in the environment, depending on their confidence in the ongoing validity of the information they already have in working storage. The knowledge base, prompted by the current understanding, can suggest the information which it is appropriate to search for, and the assumed likelihood and importance of events in the environment which need to be checked on. 'Passive' information acquisition occurs when something in the environment overrides the attention processes being controlled by the person's current train of thought. Strong signals are usually used to cause an attention override, for example a warning buzzer, but this process can also happen by serendipity, when the person happens to notice an item of information that is relevant to a sub-task which they are not currently thinking about (Beishon 1969). Passive information acquisition is much affected by the salience of the information.
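
The interplay between the two modes can be sketched as follows (a toy illustration; the salience threshold and the data structures are invented, and this is not a claim about actual attention mechanisms) :

# Active search is driven by the current cognitive goal; a sufficiently
# salient signal overrides it passively. Threshold and values are invented.

def next_information_source(goal_needs, signals, salience_override=0.8):
    """Choose what to attend to next."""
    salient = [name for name, level in signals.items() if level >= salience_override]
    if salient:
        return salient[0]      # passive : a strong signal captures attention
    return goal_needs[0]       # active : search for what the current goal needs

print(next_information_source(["feed flow"], {"warning buzzer": 0.9}))  # buzzer wins
print(next_information_source(["feed flow"], {"panel light": 0.3}))     # goal-driven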


2.1.2. Predispositions.

There are several types of predisposition in human cognitive processing, which lead to a focus on particular interpretations or inferences. Things 'come to mind' in complex tasks by processes much richer and more varied than 'similarity' and 'frequency':


a. Decision making and bias. It is often the case in complex dynamic tasks that there is insufficient or ambiguous evidence about what is the present state of the situation, or what will be the effects of potential actions. So people doing these tasks have to make decisions under uncertainty. In conditions of uncertainty, even a mathematically optimal decision maker cannot be correct 100% of the time. And it is easy for biases, that is subjective rather than objective probabilities and costs, to enter into this type of decision making (Reason 1987; a numerical sketch of this effect is given after this list). These subjective probabilities can be affected by the organisational culture, as well as by the personality and problems of the individual (Dixon 1976).


b. Human beings are also not always good at making use of new evidence in revising their interpretations and plans (e.g. Kahneman, Slovic and Tversky 1982). The reformulations of these ideas by Lopes (e.g. 1986, 1987) fit in with the approach taken in this paper, by emphasising the effects on preferences of the goals and strategies which people bring to decision making under uncertainty.


c. Information reduction is a way of handling complexity.

d. Processing heuristics do not cover all the possibilities but use 'best guess' methods to focus on what is most likely or useful.

e. Frames, scripts, scenarios similarly add inferences about what is relevant, or happening, or what to do in particular contexts.
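
The numerical sketch promised in point (a) above : a two-hypothesis decision under uncertainty, in which replacing an objective probability with a biased subjective one changes the choice of action. All the numbers are invented for illustration.

# Expected costs of acting vs. waiting, given a believed fault probability.
# Biased (subjective) probabilities or costs shift which option looks best.

def expected_costs(p_fault, cost_miss, cost_false_alarm):
    """Return (cost of acting, cost of waiting) under the given beliefs."""
    act = (1.0 - p_fault) * cost_false_alarm  # acting when there is no fault
    wait = p_fault * cost_miss                # waiting when there is a fault
    return act, wait

# With an objective probability of 0.1, waiting is cheaper (18.0 vs 10.0)...
print(expected_costs(0.10, cost_miss=100.0, cost_false_alarm=20.0))
# ...but a biased subjective probability of 0.3 tips the decision to acting.
print(expected_costs(0.30, cost_miss=100.0, cost_false_alarm=20.0))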


2.1.3. Knowledge bases.

To make inferences and plans, to add to the information from the environment, people need reference knowledge about the entities they are working with, and about the methods which they use for dealing with various situations. A person may have a large number of interrelated types of knowledge for a particular task. For example, in industrial process operation the operator may have five general groups of knowledge, outlined below. These different types of knowledge have been distinguished because they may be optimally represented in different ways (Bainbridge 1988, 1993a).


1. Operators may have 'empirical' or associative knowledge about how the process works, that is 'if x then y' knowledge acquired by experience, such as : 

- interface information (linked to) process state; 

- the information which will support or eliminate alternative inferences; 

- actions (linked to) the enabling conditions or actions needed before the action can be made; 

- actions (linked to) the effects which they achieve; 

- required effects (linked to) the actions with this effect. 

This knowledge may or may not be conscious and verbalisable.


2. Empirical knowledge may or may not be underpinned by explanatory knowledge, such as : 

- the physical structure, functional structure, causal structure, and behaviour over time of the process; 

- the mappings between these types of knowledge (for example, the relation between a change in the physical configuration of part of the process, as a cause, and its effects, and the time course of how these effects develop).


3. The third general grouping is knowledge about parameters of the process, for example : the product targets, plant constraints, and other criteria for optimising and compromising in the choice of behaviour, and the probabilities and costs or payoffs of states and events. Knowledge of these parameters does not come just from experience with the plant, but also from training and from the organisational culture.


4. The fourth general type of knowledge is about events on the process, such as : expected sequences of events, past incidents or cases, and the life history and character of the process.


5. The fifth main grouping is knowledge about how to operate the process. This includes : perceptual-motor skills such as perceptual patterns known to represent states in the process, or a 'feel' about whether everything is going right, or 'feel' skills in manually controlling the process. Operating knowledge also includes :

- specific task working methods (such as a sequence of process states and the physical actions for reaching them) which are used to meet the task goals; 

- cognitive working methods for thinking about meeting the task goals; 

- more general strategies for situations in which the same method cannot be used each time. 

Each of these working methods may have related meta-knowledge, such as the amount of time it takes, how risky it is, etc. This meta-knowledge is used in deciding whether a method is the best one to use in given circumstances (e.g. Bainbridge 1978, Valot and Amalberti 1991). These parameters of the working methods may come from personal experience or from the organisational culture. 

Finally, operators may know working procedures, that is, pre-specified instructions about what to do in given circumstances, which are not usually devised by the operator.


There may be categories, analogies, and similarities in any of these types of knowledge, so there is no simple single definition of similarity. Many of these knowledge types have been called the 'mental model' by one author or another (see Rogers, Rutherford and Bibby 1992).
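
As a purely illustrative sketch (not from the paper; every entry is invented), part of such a set of knowledge bases might be represented as follows, showing empirical 'if x then y' associations alongside parameters, events, and method meta-knowledge :

# A toy representation of some of the knowledge groupings described above.

process_knowledge = {
    # 1. empirical associations : required effect -> actions with that effect
    "effects to actions": {"raise temperature": ["increase fuel", "close vents"]},
    # 3. parameters : targets, constraints, costs
    "parameters": {"target temp C": 300.0, "max temp C": 350.0},
    # 4. events : past incidents, expected sequences
    "events": ["1989 feed blockage", "normal start-up sequence"],
    # 5. operating methods, each with meta-knowledge about its properties
    "methods": {"manual trim": {"time s": 60, "risk": 0.1}},
}

# An empirical lookup : which actions achieve a required effect?
print(process_knowledge["effects to actions"]["raise temperature"])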


2.2. Organising the sequence of behaviour

People doing a complex task usually have several simultaneous task responsibilities. For example, a process operator might have to operate or supervise a feed supply, a distillation column, and a heat exchanger, all at the same time. An emergency services manager may have to coordinate fire-fighters in the air and on the ground. Such operators also have several simultaneous and interdependent cognitive goals, for example understanding the present situation, evaluating what will happen in the future, and developing a method of coping with it. For example, Reinartz' (1989) team of operators were working concurrently on nine to ten goals. In addition these tasks are often sufficiently complex that any one configuration of situation details is not (frequently) repeated, and changes in the situation develop over time. All these factors mean that it is unlikely that someone can complete the working method for one part of their whole task before starting on another part of the task. People doing these tasks have to be flexible and adaptive to details of the present situation. They have to decide on a satisfactory sequence in which to do parts of the task. They have to choose how to switch between different task and cognitive goals, so that adequate action is taken, at the appropriate time, to keep the external situation acceptable. This allocation of effort between sub-tasks and goals is a key aspect of the effectiveness with which a complex dynamic task is done.


In continuous control tasks, this 'multi-tasking' involves more than the allocation of attention in sampling the many dimensions of the present state of the process. The operators have to update their cognitive goals, such as understanding the process, predicting the future, and planning. Planning may include both how to meet particular task goals, and devising an overall sequence of activity to meet several goals during the next time period (Beishon 1969).


The operators also have to integrate their actions with the evolution of changes in the process. For example, in a steelworks power control task (Bainbridge 1974), the operators cycled between checking acceptability of the process state and doing a section of task thinking. If, when they checked the plant state, there was a clear need for action, then they made an action, usually one which they had thought out beforehand. If the process state was unacceptable but action not urgent, they chose or refined the choice of their future actions. If the process state was acceptable, they reviewed what would happen in the future, and the implications for action. Whichever type of thinking they did, the detailed nature of their thinking depended on the results of previous thinking about this and other topics, where they had got to previously in the sequence of possible thinking about this topic, as well as on the current state of the process. And, whichever type of thinking they did, how long they went on thinking about this, before they stopped and went back to checking the acceptability of the process state, depended on how acceptable the process state had been when they last checked it.
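
This reported cycle can be sketched as follows (an illustration of the logic described above, with invented states, actions and durations; it is not the author's data or model) :

# Choice of activity after checking the plant state, and how long to think
# before checking again. States and durations are invented.

def next_activity(process_state, urgent, prepared_action):
    """Choose what to do after checking the plant state."""
    if process_state == "unacceptable" and urgent:
        # Act, usually with an action thought out beforehand.
        return prepared_action or "choose an action now"
    if process_state == "unacceptable":
        return "choose or refine future actions"
    return "review future events and their implications"

def thinking_time_s(last_checked_state):
    """A less acceptable state leaves less time before the next check."""
    return 10 if last_checked_state == "unacceptable" else 60

print(next_activity("unacceptable", urgent=True, prepared_action="cut power to furnace 3"))
print(next_activity("acceptable", urgent=False, prepared_action=None))
print(thinking_time_s("acceptable"))  # 60 : more time for thinking ahead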


Beishon 1969, Reinartz 1989, and Amalberti 1992 present data from operators who are doing tasks in which the process has to pass through a sequence of states. For example, Reinartz studied a team of operators dealing with a major process fault, while Amalberti studied pilots during the sequence of phases of their flight. Such operators may have to work simultaneously on several task goals, such as controlling the dynamics of the aircraft while following air-traffic control instructions about safe position and heading. They may have a prior plan which they have to integrate with the expected evolution of the plant or aircraft behaviour, and with events which may be unpredictable in various ways, such as unexpected aspects of a plant fault or messages from air-traffic control, as well as with the exact nature (such as the size and timing) of expected events.


A way in which to account for this ability to sequence behaviour effectively is to suggest that operators choose what to do next (not necessarily consciously) on the basis of the context. And this context is provided not only by the current state of the plant but also, importantly, by the temporary structure of inference they have built up in working storage. This structure gives an overview of the present state of the task. This overview supplies data on the probability of, importance of, and constraints on, various events and activities, as well as evidence on the point the person had reached in their previous task thinking, and data to use in later task thinking.


Sequential and contextual models of behaviour organisation. 

The data on the nature of behaviour in complex dynamic tasks emphasise the importance of a particular type of cognitive model.


Different types of cognitive processing model may be adequate to describe cognitive behaviour in different types of task. In HF/E the most frequently used models of cognitive processes consist of a set sequence of processing stages, starting with reacting to information, and ending with action execution. Rasmussen's 'ladder' model (e.g. 1986) is an open-loop example in which some of the stages may be omitted. This model represents the cognitive processes found in a study (Rasmussen and Jensen 1974) of a non-dynamic task which did not involve control or multitasking [electronic equipment repair], and in which people predominantly used a context-free strategy, i.e. one in which an overview of the task situation is not necessary. So that task did not contain many of the key features of complex dynamic tasks identified in this paper, and it should not be surprising if that model does not contain components which can be extended to describe complex dynamic tasks.

Both open and closed loop human operator models may be of the 'set sequence of stages' type. Such models cannot effectively describe cognitive behaviour in complex dynamic environments, because this behaviour does not occur in a simple perception-decision-action sequence. Essential features of complex cognitive behaviour are the multiple interdependent cognitive goals (aspects of either perception or decision), and the flexibility of the order in which these task topics are considered: any one aspect of cognitive processing may be done before or after any other. A competent person in this type of task does whatever aspect of thinking is appropriate to the current situation, which can be very varied, and they use active information search (perception within decision, to use those terms). This behaviour cannot be represented by a model which specifies the sequence in which subparts of the cognitive processing are done as one directional (e.g. arrows only from left to right). The behaviour is better modelled as being based on a temporary structure of inference which describes the present task situation and the proposed plan of action, with a mechanism for generating the sequence of behaviour as a function of this temporary inference structure (see Figure 2). (This distinction between sequential and contextual models is not the same as the difference between serial and parallel processing. Existing context models in ergonomics do not constrain which parts of the processing are done in parallel.)


The need for this change from sequential to contextual models of cognitive processing is discussed by Bainbridge 1989b, 1993b, 1997 and Hollnagel 1992. The terms 'sequential' and 'contextual' emphasise the difference between a 'set sequence of processing stages' compared with 'choice of processing focus as a function of context'. (It may be confusing that the term 'sequential' refers to the 'pre-specified sequence' type of model.) For the purposes of this paper it is only necessary to understand the general features of a contextual model, without details of how it might be implemented. Bainbridge 1992 and Hollnagel 1993 contain more detail about suggested versions of such a model.


2.3. Dealing with unfamiliar situations

In many complex tasks, particularly highly automated tasks, the operator may be expected to deal with situations which have not been anticipated. In most complex dynamic tasks there is the possibility that combinations of circumstances will arise which have not been met before. So the person does not have available a standard working method and/or reference knowledge, either for understanding what is happening or for choosing an effective action. They have to develop for themselves a working method and related knowledge : they have to problem solve. The aim here is to give a simplified account of this problem solving activity, which is sufficient to indicate the type of support needed. For the effects of unfamiliarity on workload, see 2.4.2.


The methods which a person uses to solve a problem (e.g. Shepherd 1989, de Keyser 1991) may include a mixture of asking local experts, reasoning by analogy, or using basic knowledge, such as to: 

- ask someone else with appropriate expertise; 

- think of a previous similar situation on the process, and adapt the working method used then ('case-based' reasoning); 

- think of an analogous situation, not on the process, to solve the problem on the analogy, and check if this solution applies to the real problem; 

- combine parts from other methods into a new overall sequence;

- reason from first principles, perhaps to think of the required goal state, imagine a trajectory of states connecting the actual state to the required state, and then think of actions which will carry out each of these state transforms.


Whichever approach is used, if trial and error is not appropriate for testing the proposed method, then the person may mentally check the method to assess its effectiveness. If they have previous experience with part or all of the working method they should know something about it, such as how long it takes, how accurate a result it gives, how much mental effort it involves, how much risk, etc. So they can check this 'meta-knowledge' to test whether it fits the requirements of the present situation (Bainbridge 1978). This check may include personal as well as task goals : is the proposed activity interesting, amusing, helpful, exciting, etc.? If the person does not know much about the properties of a proposed working method, then they may imagine carrying it out, and check its imagined effects against the task and personal criteria.
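
A minimal sketch of this meta-knowledge check (illustrative only; the property names and all numbers are invented) :

# Check a proposed working method's known properties against the
# requirements of the present situation.

def method_fits(method, time_available_s, accuracy_needed, risk_acceptable):
    """Return True if the method's meta-knowledge fits the current requirements."""
    return (method["time_s"] <= time_available_s
            and method["accuracy"] >= accuracy_needed
            and method["risk"] <= risk_acceptable)

adapt_previous_case = {"time_s": 120, "accuracy": 0.80, "risk": 0.2}
reason_from_first_principles = {"time_s": 900, "accuracy": 0.95, "risk": 0.1}

# Under time pressure, only the case-based method fits the requirements.
print(method_fits(adapt_previous_case, 300, 0.7, 0.3))            # True
print(method_fits(reason_from_first_principles, 300, 0.7, 0.3))   # False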


Problem solving in practice thus may involve many factors which are often not included in theories of human problem-solving based on laboratory studies. Observation suggests that :

- reasoning from first principles is not necessarily the first strategy used; 

- multidimensional probabilities, costs and values may affect the choice of solution, not necessarily consciously; 

- much use is made of two groups of human cognitive processes, firstly categories, similarities, analogies and case-based reasoning, and secondly meta-cognition using knowledge of the properties of potential activities; 

- neither the starting point nor the 'solution', the final state required, are necessarily pre-specified in detail, for example, if the aim is to get a faulty high-risk plant 'into a safe state'; 

- the nature of the 'problem space', the structure of the environment and the facilities available, are not always fully specifiable in advance, for example in fire-fighting management the fire distribution, available appliances and work force are not exactly repeated, so it is only possible to develop prior working methods to deal with general categories of situation (though these can be effective, Samurçay and Rogalski 1988, Klein 1989). This illustrates the need for some of the knowledge items specified in Section 2.1.3, such as knowledge of previous incidents, categories, meta-knowledge, and general strategies.


2.4. Workload, team work

Error rates may be exacerbated by at least three factors. 

(1) Workload may increase errors, particularly inefficiencies in the allocation of time between parts of the task. 

(2) Team work can affect error rates due to interruption or distraction, biases in decision making, or the allocation of effort between parts of the task. 

(3) Equipment and environment design also have strong effects on error rates (indeed error rate is one of the main measures used by HF/E in assessing equipment), but this third group of factors will not be discussed here; for a recent review see Hollnagel 1993.


2.4.1. Cognitive capacity and mental workload.

Human cognitive processing can be limited in capacity. The essential relation for present purposes is that if mental workload is greater than mental processing capacity, then errors will increase. However, measuring the load and capacity is not simple. In physical tasks effort expended and amount of work done are the same, but this is not necessarily true in mental tasks. It is useful to distinguish between the task demands, that is, what the task requires should be done, and the mental workload involved in meeting these demands. This section will briefly discuss three topics : 

- factors affecting the relation between the task to be done and the mental processing capacity used, 

- the effects of time pressure, 

- unfamiliarity.


Workload and the task

Unfortunately it is difficult to put a number on human processing capacity requirements in real tasks, for at least three reasons, to do with equipment design, the working methods available, and other effects of expertise. Although this multitude of influences, on how much mental effort is needed to do a particular task, may make it impossible to quote absolute numbers when discussing mental workload, it is possible to point out factors which alter the rate of errors, and which should therefore be considered in design.


Operators use part of their mental effort to do their main task, to understand and operate the process or aircraft, for example, and part of their mental effort to understand and operate the interface and job-aids they have been given to do the task with. The same task can require more or less effort, depending on the equipment used to do it. This is one of the fundamental reasons for HF/E.


If there are alternative working methods or strategies for doing a task, each of which requires a different amount of mental work to meet the same task demands, then there may not be a monotonic relation between task demands and mental workload (Sperandio 1972).


Expertise increases mental working capacity. The efficiency with which parts of the task are done increases with practice, indeed this is one definition of skill ['skill' in the general sense of level of expertise, not to label a particular type of cognitive processing]. People who do complex dynamic tasks, such as nuclear power station operators, air-traffic controllers or pilots, are often expected to have several years of experience before they are considered fully qualified to do the work. There are various changes in processing with extended practice. 

- Perceptual-motor skills improve and may become 'automatic', i.e. not using any of the limited type of processing capacity. 

- Cognitive skills develop, which consist of readily available working methods, including the appropriate predispositions and reference knowledge.

- Expertise affects how large a 'picture' of the current situation can be built up in working storage. The 'magical number 7 plus or minus 2' limit applies to the capacity of working memory in situations in which people without relevant experience have to remember arbitrary material. By contrast, for example, Bisseret 1970 found that experienced air-traffic controllers remembered on average 35 items about the situation they were controlling. With experience people learn the redundancies in the items to be remembered, and the whole structure of working storage, working methods and reference knowledge becomes interrelated and mutually reinforcing (Bainbridge 1989d).


So, with increased skill, the amount of mental work needed to achieve given task demands is reduced. Experts can therefore have more spare time, which they may use to plan ahead, which in turn makes them more efficient.


Unfortunately there are limits to these improvements. Over-learning, over-automatising of activities, can lead to rigidity in the methods used to do a task and to a reduction in the amount of checking done, and so to 'slip' errors (Leplat 1989). And the extended mental representation of the situation can break down when the person is interrupted or distracted.


Effects of time pressure. 

Complex dynamic tasks are essentially done within time limits : actions have to be made within a particular time to be successful. When the tasks to be done take longer than the time available, there are several ways in which error rates can increase. Heuristics (see 2.1.2) may allow efficient conclusions to be reached quickly.

But within the components of a working method, there may be a basic speed-accuracy tradeoff in decision making or the execution of actions. 

The length of time allowed affects the amount of information which can be taken in, and the extent to which it can be processed. If decisions or actions are made quickly, then less information can be obtained or processing done, so there are more likely to be errors. 

When there is too little time for all the sub-tasks, some will have to be omitted. 

Finally, if all the available time is taken up with choosing the response to existing situations, there is no spare time to anticipate and plan for future situations (Hacker 1993). This means that responses to future events, when these events occur, will be made more slowly because the reaction has not been thought out beforehand, and made less effectively because there is less time to consider all the factors which could affect the optimum choice of action.


Unfamiliarity.

Complex dynamic tasks often involve dealing with unfamiliar situations. There are several important aspects of this in relation to workload.


1. An unfamiliar situation is one for which the person does not have a previously practised working method and/or relevant reference knowledge. So by definition, if expertise consists of having a readily available working method, etc., as suggested above, then expertise will be lower in unfamiliar situations. This will be the case even for people who are in general more expert in this task. Experts will be less likely to meet situations which they have not met before, and are more likely to have appropriate components of behaviour and knowledge which they can fit together to deal with a new situation. But even for them an unfamiliar situation will be more demanding, because it requires problem solving, which is the most capacity-demanding type of cognitive processing.


2. The task demands in an unfamiliar situation are likely to be greater. For example, an industrial process plant with a fault is more likely to be eventful, so there is likely to be more to be done in a given period of time. Also, if the person is using unusual working methods, the interface and job-aids are less likely to have been designed to support these methods, and so may interfere with the ease of doing the task.


3. Unfamiliarity or high levels of mental workload are themselves stressful, or 'arousing', at least for some individuals. And high as well as low levels of arousal can lead to reduced mental processing capacity (inverted-U relation between arousal and performance).
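
The inverted-U relation can be given a toy form (the quadratic shape and scaling are assumptions chosen only to show the shape, not an empirical model) :

# Available capacity peaks at moderate arousal and falls at both extremes.

def capacity(arousal):
    """Arousal scaled 0..1; returns relative available processing capacity."""
    return 1.0 - 4.0 * (arousal - 0.5) ** 2

for a in (0.1, 0.5, 0.9):
    print(a, round(capacity(a), 2))  # 0.36 at both extremes, 1.0 at the midpoint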


So mental workload in unfamiliar situations is likely to be higher, and it is important to design to support these activities. Figure 3 summarises the effects on mental workload and error rates which have been introduced in this section.


Figure 3. Some factors affecting mental workload and errors which have been mentioned in the text.


2.4.2. Team work effects.

A team builds up the same sort of group understanding of the situation, and plan of action, as does an individual (see e.g. Reinartz and Reinartz 1989). This paper does not contain a full discussion of cooperative working; this section is just a brief reminder of some of the effects.


Team work can have considerable advantages in relation to errors. 

For example, teams often provide effective error recovery mechanisms. People in the team may not all have the same predispositions, and so may be able to notice each other's errors more effectively than they can notice their own. This possibility is useful only if the social atmosphere allows people to comment on others, particularly on the activities of superiors. 

Teams can also be more effective than individuals because they can pool knowledge in building up understanding and action plans, and can distribute the workload between them.

However, people working together are likely to interrupt or distract each other, and so to increase the errors due to these factors. Social, cultural and organisational influences affect biases in decision making, and other predispositions. All the aspects which have for decades been studied by occupational psychologists, such as job-satisfaction, job involvement, socio-technical systems, the effects of group membership on performance, and the effects of organisational culture on attitudes to safety, risk and accident rates (which are now also studied in macro-ergonomics and the social anthropology of work), can have effects on error rates via effects on decision biases and predispositions. Team work also inherently involves organising the allocation of effort between sub-tasks, with attendant possibilities for confusions about responsibility, and therefore errors.


3. Difficulties and errors related to cognitive processes


The examples of cognitive difficulty and error which have been described in the literature can be grouped into categories as a function of the cognitive mechanism/ processing involved. The two main groupings used here are concerned with the person's overview and its effect on what to think about or do next, and with their understanding of the situation and what to do about it. Except at the beginning of work, these cognitive goals are interdependent. 

At first, at the beginning of shift or after a major unexpected event such as a fault, understanding must be built up, and it may be incomplete due to time pressure. 

After this, throughout the rest of the working period, understanding, planning and acting are interdependent. The results of any one aspect of thinking are kept in working storage, and provide the context for other thinking. Understanding determines what to do next, and that determines new understanding, as indicated in Figure 2.


The error scheme offered by Rasmussen (e.g. 1982) does not include any account of :

- the temporary inference overview built up in working storage, which provides the context and data for later, and more detailed, cognitive processing, 

- the organisation of behaviour in time, allocating effort to different parts of the overall task.

Reason (1990) mentions these factors, but they do not form a strong part of his GEMS framework. However these are two aspects which (as also argued by Hollnagel 1993) are central to behaviour, and therefore error mechanisms, in complex tasks.


Because understanding, planning, and action are interdependent, it is not always obvious which of the categories a given error or weakness in processing should be assigned to : this implies that multiple context factors are involved. Also, because all the cognitive processes are interdependent and simultaneously active, the choice of sequence in which to describe the 'error' types is necessarily somewhat arbitrary. The sequence used here has been chosen to emphasise the 'top-down' or 'inner-outer' character of the processing.


In this account, each category is based on the type of cognitive mechanism/ processing which has failed, and the discussion lists examples of errors and difficulties which can be attributed to this. This analysis concentrates on errors and difficulties while thinking about the task. The sort of factors which are usually the focus of error analyses, such as poor information sources, environmental quality, action execution, fatigue, shift work and other job organisation aspects, have not been included. The list of errors and difficulties is also incomplete. It is not possible to cover all the possible types of failing in all possible dynamic environments. This list aims to give examples of inefficiencies, especially from process operation, to illustrate the nature of the categories. 


And only illustrative references are given. In talking about these errors with people who study real complex dynamic tasks, I have found that most people can quote cases which they have observed, but which they have not mentioned in published papers because the cases seem to be anecdotal events which do not fit into any existing theoretical framework. Sometimes these events have been mentioned in research reports, so some of the references given here are rather obscure. Where a reference is not given for an error or difficulty, this example comes from the author's observation. One hope of this paper is that these types of error and difficulty will be given more attention in future, so that they become better understood.


3.1. The overview and behaviour organisation

Failings in goal formulation, and in the allocation of mental effort between sub-tasks, form an important category of errors and difficulties in dynamic tasks. Because a large range of error or difficulty types come in this category, it has been subdivided. The main category, on building up an overview of the task and sequencing the behaviour, is followed by a section on workload, on the effects of allocating too little time to a task. These aspects of processing are all affected by the present state of task understanding, and by assessments of the likelihood, importance and cost of events and actions, for which see 3.2.


3.1.1. The overview and allocation of effort between sub-tasks.

A major group of identified errors and difficulties is concerned with failure to form a general overview of the task, with the result that each sub-task is dealt with in isolation as it arises, and is not considered within the context of the task as a whole.


Weaknesses in building up an adequate overview of the task, or in integrating the sub-tasks. 

These difficulties come in three groups :


a. difficulties in relating a sub-task to a wider whole, such as : 

- shifting between sub-tasks without relating them to the task as a whole ('thematic vagabonding') (Doerner 1980, 1987, Brehmer 1991, Roth and Woods 1988, Schaafstal 1989).

- reacting to events, rather than anticipating events and how to prevent or respond to them ('diminished planning horizon') (Hacker 1993).

- failure to relate information to underlying causes and explanation, when deeper understanding is needed (Brehmer 1991, Bainbridge 1984).


b. the inverse, weaknesses in clarifying the substructure of the task, such as : 

- wrong or inadequate breakdown of the task into sub-problems (Cellier, Eyrolle and Mariné 1992); 

- failure to devise intermediate goals (for example in fault diagnosis, not first narrowing down the problem to the general area of the fault, Patrick 1993).


c. specific difficulties in timing (for examples and discussion see e.g. de Keyser 1991), such as : 

- poor memory for temporal deadlines; 

- poor timing of actions (see also Norros and Sammatti 1986); 

- poor synchronising of time scales in different parts of the task.


Weaknesses in allocation of time between sub-tasks or working methods. 

a. staying with the part of the task the person knows how to deal with ('encystment') (Doerner 1987, Brehmer 1991),

b. acting on an assessment of importance appropriate to an irrelevant goal. (In the steel works simulation studied by the author, Bainbridge 1974, one of the non-operators observed was the person who wrote the process simulation. He concentrated on choosing the most elegant possible control solutions, while the actual control state was allowed to go wildly out of tolerance.)

c. not applying available strategies in a systematic way (Patrick 1993).


Weaknesses in allocation of attention. 

Failures in allocating attention to check the current state of the task may be due to biased (implicit) decisions about the relative importance of parts of the task, or to inadequate knowledge of process dynamics. These come in several groups :

a. attending to the required dimensions of the environment, but doing this inadequately, i.e. updating working storage with wrong or inadequate timing, or incompletely (e.g. Moray 1974).

b. not attending to all the relevant dimensions of the situation ('tunnel vision') (Bartlett 1943, Moray 1990, Runmar 1990).

c. not looking for the information which is needed to test or revise inferences about the state of the environment (Bainbridge, 1984).

d. not checking behaviour, for example : 

- not checking that the behaviour chosen is appropriate in the current environment ('slips') (Norman 1981, Reason and Mycielska 1982); 

- inadequate supervision of the effects of an action (Norros and Sammatti 1986, Brehmer 1991).


Team work. 

Similar failures in the organisation of task behaviour can occur in teams. These issues can be put into several groups :

a. failure to develop or implement a team overview of the task, such as : 

- lack of a team strategy (Norros and Sammatti 1986); 

- failure to communicate information, inferences, goals and plans, between members of a team (e.g. Rolt, Abermule, chap. ). This can be exacerbated by a change of shift.

b. forgetting due to interruption or distraction.

c. inadequate allocation of, or perception of, responsibility, such as : 

- allocation of tasks to team members who have not got the time, expertise, or role, necessary (Doerner 1987, Rolt); 

- poor supervision of inexperienced workers (de Keyser and Nyssen 1993); 

- shared responsibility, and therefore a diminished sense of personal responsibility (Gall 1990); 

- blaming others for one's failures (Doerner 1987).


3.1.2. Workload effects.

Time pressure in general increases difficulties with task organisation and the allocation of resources. 

For example, if there is no time to anticipate, actions have to be chosen without considering the wider context (Hacker 1993). There may be increased conflicts in decisions about what is the most important thing to do next. 

And there may be increased difficulties in shifting from thinking about one part of the task to thinking about another, for example an air-traffic controller shifting between thinking about different groups of aircraft. This context shift is difficult because the person has, to some extent, to reconstruct or remind themselves of the temporary inference structure which represents their understanding of another part of their task, and doing this reconstruction may take time.


If the workload is such that not all the tasks can be done well in the time available, then behaviour organisation decisions must be made about which tasks to degrade or omit (a simple sketch of such load shedding is given after the lists below). So errors and difficulties of these types reflect assessments about the relative importance, probability and cost of different tasks (see also 3.2.1). These difficulties can be divided into three main groups, depending on whether time or work pressure has its effect via a reduction in the quality of each task done, or a reduction in the number of tasks done.

a. The quality of work done can be affected by reducing the amount of time allocated to each task. This increases failings such as : 

- reducing the accuracy to which discriminations are made, in effect increasing the 'filtering' of information; 

- making faster but less accurate actions, reflecting a shift along a speed-accuracy tradeoff.


b. Alternatively, quality may be affected by changing the level of risk accepted, in order to do things more quickly. Examples are : 

- choosing high-payoff, high-risk actions; 

- making actions which are effective but which violate safety codes.


c. Coping with work pressure by leaving out some of the things to be done can take various forms, which could be put on a scale of seriousness, such as : 

- delaying decisions in the hope of a future pause during which it will be possible to catch up;

- not processing certain categories of information, or parts of the task, for example 'rule rigidity' (inflexibility), or not considering special cases;

- arbitrarily leaving out some parts of the task; 

- abandoning the task.
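
This quality/quantity tradeoff can be made concrete with a minimal sketch, in Python. It is only an illustration of the argument, not a model proposed in this paper : the priority rule (importance x probability x cost of omission), the task names and all the numbers are invented assumptions.

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    full_time: float     # time needed to do the task well
    importance: float    # judged importance of the task (0-1)
    probability: float   # judged probability that the task is needed
    cost: float          # judged cost of leaving the task undone

    def priority(self) -> float:
        # one plausible reading of "importance, probability and cost" :
        # the expected cost of omitting the task
        return self.importance * self.probability * self.cost

def schedule(tasks, time_available):
    """Give full time to tasks in priority order; degrade, then omit."""
    plan, remaining = [], time_available
    for task in sorted(tasks, key=Task.priority, reverse=True):
        allotted = min(task.full_time, remaining)
        remaining -= allotted
        plan.append((task.name, allotted))   # 0.0 means the task is omitted
    return plan

tasks = [Task("alarm check", 2.0, 0.9, 0.8, 1.0),
         Task("log entry",   3.0, 0.3, 1.0, 0.2),
         Task("trend scan",  4.0, 0.6, 0.5, 0.6)]
for name, t in schedule(tasks, time_available=5.0):
    print(f"{name}: {t:.1f} time units")
# alarm check gets full time, trend scan is degraded, log entry is omitted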


Doerner 1987 has identified an 'Intellectual Emergency Reaction' in people who are under pressure. Their reactions may be quick, but they may also show ruthlessness, a diminished planning horizon, and a lack of concern for the side effects of their actions. These all represent a deterioration in the optimum structuring of the task. 

'Breakdown' is a reaction to work pressure which is so extreme that no effective task processing is done (Janis and Mann , Dixon 1976). Example forms of this are : 

- refusal to make decisions involving any uncertainty (see also Doerner 1987); 

- constantly cycling through the task demands without taking either cognitive or physical action about them; 

- a mental blank.


Forgetting. 

Failures in the temporary inference structure in working storage can occur because of failings in the processes which build up this structure, such as poor attention, poor structuring of goals, or poor coordination of timing. Forgetting can also occur because of failings in the mechanisms of memory. The well known mechanisms by which short-term memories can be forgotten are listed below (a simple illustration of decay follows the list) :

a. overload of memory capacity;

b. decay, that is information fading in memory and needing to be refreshed;

c. interference, that is memory failures due to interruption or distraction ('capture' errors), such as : 

- a person forgetting why they started to do something; 

- a total loss of the 'picture', that is, of the task related structure of inference which has been built up over time to represent the situation related to the task demands.
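
Mechanism (b) can be pictured with a minimal sketch, assuming exponential decay of an unrefreshed trace; the half-life and threshold are invented values, not measurements. An interruption which delays the refresh lets the item fade past the point of recovery.

import math

HALF_LIFE = 10.0   # seconds for an unrefreshed trace to halve (assumed)
THRESHOLD = 0.25   # activation below this counts as forgotten (assumed)

def activation(seconds_since_refresh):
    """Exponential decay of an unrefreshed item in working storage."""
    return math.exp(-math.log(2) * seconds_since_refresh / HALF_LIFE)

# an interruption delays the refresh, so the item decays past the threshold
for t in (5.0, 15.0, 30.0):
    a = activation(t)
    state = "retained" if a >= THRESHOLD else "forgotten"
    print(f"{t:4.0f}s since last refresh : activation {a:.2f} -> {state}")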


3.2. Interpreting the evidence and understanding the situation

In complex dynamic tasks, the available evidence is often insufficient for it to be clear to the operator what is the underlying explanation of a situation, or what action would be most effective. In such situations a person has to decide between alternative hypotheses, and biases and predispositions can affect these decisions. Failings in the process of understanding have been divided into two main categories concerned with :

- choosing the best interpretation of the evidence, or plan of action. This includes using the knowledge base to suggest alternative hypotheses about the underlying explanation of the available evidence, or effective actions to try, and to identify the additional evidence needed to test between these alternatives;

- inference, adding knowledge from the knowledge base to the interpretation.


3.2.1. Identification, interpretation and judgement.

Making an interpretation or plan involves choosing between alternatives. It is not always possible to be correct when deciding between alternatives under uncertainty. The incidence of wrong outcomes can be exacerbated by predispositions about the likelihood, similarity, or importance of the alternatives. These predispositions may be affected by the current task context, the present temporary inference structure, or by motivation, attitudes, and social, cultural and other aspects of the organisational environment. The factors have been divided into groups to do with effects which could be attributed to (a small numerical illustration follows this list) : 

- weaknesses in the cognitive processes of making and revising hypotheses; 

- incorrect judgements; 

- effects of salience, either in the environment or in the knowledge base.
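
The effect of a predisposition can be shown numerically with a minimal sketch, treating the choice in simple Bayesian terms; all the numbers are invented for the illustration. The same piece of evidence favours hypothesis B, but an inflated prior for a salient or recently used hypothesis A tips the decision the wrong way.

def posterior_a(prior_a, likelihood_a, likelihood_b):
    """Posterior probability of hypothesis A after one piece of evidence."""
    p_a = prior_a * likelihood_a
    p_b = (1.0 - prior_a) * likelihood_b
    return p_a / (p_a + p_b)

# the evidence is actually twice as likely under hypothesis B ...
LIK_A, LIK_B = 0.2, 0.4

# ... but a predisposition gives hypothesis A a raised starting prior
for prior_a in (0.5, 0.8):
    p = posterior_a(prior_a, LIK_A, LIK_B)
    verdict = "accept A (wrong)" if p > 0.5 else "accept B (right)"
    print(f"prior for A = {prior_a:.1f} : posterior for A {p:.2f} -> {verdict}")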


Making and revising hypotheses. 

Weaknesses in the cognitive processes of making and revising hypotheses could be divided into three groups :

a. A typical weakness in human information processing is to focus only on positive evidence, not exploring disjunctive alternatives (Kahneman, Slovic and Tversky 1982).


b. Both choosing the original interpretation or plan, and revising the assumptions given new evidence, involve implicit decisions about what information is required (see also 3.1.1). 

- failure to get or use all the relevant information before making a decision (e.g. Patrick 1993, Gall 1990); this is called 'bounded rationality' by Doerner 1987 and Brehmer 1991; 

- not using all the known alternatives in interpreting the situation, for example in fault diagnosis (Decortis 1992).


c. Once a person has decided on an explanation of what is happening, or a plan of what to do, they often have difficulty in revising this interpretation given new evidence. This revision is affected by the same sorts of factors as are involved in making the original interpretation. Some examples of difficulties with cognitive processes in this situation are : 

- staying with one interpretation or plan of action, although new information or lack of success suggests that it should be revised ('perceptual set', 'fixation error', 'cognitive lockup') (Doerner 1987, Brehmer 1991, Malone et al 1980); 

- failure to change a plan, when an error or difficulty in carrying out part of the plan means that the rest of the plan is no longer an optimum course of action; 

- difficulties with revising procedures (Norros and Sammatti 1986, Bainbridge 1984).


Incorrect judgements. 

There are a large number of error types which can be attributed to poor subjective measurement, i.e. judgement, or to predispositions in identification. These errors reflect biases due to the current task understanding and planning, or to the personal or social context. Such mis-assessments of probability and cost can lead to inappropriate allocation of effort between parts of the task (see 3.1). General errors of identification, such as 'inaccurate diagnosis', could be due to any of these factors. These error types have been grouped here into four categories :


a. Errors when the context, physical or social, makes the person predisposed to accept certain identifications. Examples would be : 

- ignoring a warning which frequently appears as a false alarm; 

- the person seeing what they expect to see, rather than what is actually there (Davis 1966); 

- only noticing new information which is compatible with the existing representation of what is happening (Reason 1990); 

- failing to get information when this involves checking on someone of higher status, or on a colleague who is a friend (aircraft examples, e.g. Green RoySoc, Norman RoySoc); 

- failing to analyse procedures critically (Gall 1990).


b. Inaccurate assumptions about the behaviour of other people, such as : 

- misinterpreting the intentions of others, e.g. a car driver misreading the intentions of other road users (Malaterre 1990); 

- overconfidence that supposed or actual regulations will determine how other people behave (Malaterre 1990); 

- assuming that another team member has done something, and therefore that some change in the situation has happened (Gall 1990, Rolt e.g. Abermule Chap ), for example assuming that the other shift has done something.


c. Errors in judging dimensions of the environment, such as probabilities and costs, which cannot be directly perceived and have to be estimated by oneself and other people on the basis of past experience and hopes. Examples of such errors are :

- misperception of risk (Brown and Groeger 1990); 

- not understanding the significance of information (Norros and Sammatti 1986); 

- a doctor not accepting that a patient is incurable (Boreham 1992).


d. Errors in estimating physical dimensions of the environment which are not instrumented and so have to be measured by judgement, such as : errors in speed, timing or distance perception (e.g. Brehmer 1991, Malaterre 1990).


Salience. 

Errors both in the allocation of time and in subjective measurement may be exacerbated by salience, either in the environment or in the knowledge base.

a. People are more likely to make an error by accepting a wrong hypothesis if the evidence for it is more salient in the environment. Errors of this type may be caused by factors, such as poor equipment or environment design, which are not considered in this paper. Some examples of this type are : 

- assuming that interface evidence is valid (Malone et al 1980); 

- failing to get information because there is a cost to getting it, e.g. it is a long way away (Sinclair et al 1966).


b. People are also more likely to make an error by accepting a wrong hypothesis if it is more obvious in the knowledge base, for example because it is available (used frequently and/ or recently), typical, representative, similar, plausible, important, or linked to other factors with these qualities (the 'halo effect').


3.2.2. Failures or weaknesses in knowledge referred to during cognitive processing.

There can be many difficulties in making inferences, that is, errors of incorrectly adding to or extrapolating from the given information. For some purposes, simply identifying an error as due to inadequate reference knowledge does not differentiate between the types of lack of knowledge, which need different HF/E reactions in terms of support system or training design. Any of the types of knowledge listed in 2.1.3 might be wrong, inadvisable, fuzzy, incomplete, uneven or inelegant.


Examples of failings in the use of knowledge could be put into three groups :

a. failings because the knowledge base does not contain the relevant knowledge, such as :

- inadequate knowledge of all the alternative potential explanations of an existing situation; 

- inadequate knowledge of what further evidence is needed to test between these alternative hypotheses; 

- failure to learn from experience.


b. inadequacies in meta-knowledge about alternative working methods, such as inadequate knowledge about which working method or strategy is most appropriate in which context (Patrick 1993).


c. failings in extrapolating from the knowledge base in a specific situation. Examples could be : 

- failure to anticipate or take into account the main or side effects of an event or action (Gall 1990, Cellier, Eyrolle and Mariné 1992, Reason 1990); 

- forgetting or not noticing special constraints on, or components of, familiar actions, which do not normally apply but which are relevant now (such as, in process operation, allowing for the fact that some parts of the plant are unavailable due to maintenance); 

- use of the wrong category in analogical or case-based reasoning;

- difficulties with mental models of dynamics (a small simulation illustrating the first of these follows this list), such as : 

- -  misinterpretation of delayed feedback, 

- - difficulties with allowing for exponential changes in variables.
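
The first of these difficulties can be illustrated by a minimal simulation; the gain, delay and one-variable 'process' are invented assumptions. A correction rule which settles smoothly with up-to-date feedback overcorrects when it acts on a stale reading, and the state swings with growing amplitude.

def simulate(delay_steps, gain=0.8, steps=12):
    """One-variable process : the state moves by whatever correction is
    applied, but the controller sees the state as it was delay_steps ago."""
    state, target = 10.0, 0.0
    history = [state]
    for t in range(steps):
        observed = history[max(0, t - delay_steps)]   # stale reading
        state += -gain * (observed - target)          # proportional correction
        history.append(state)
    return history

print("no delay :", [round(x, 1) for x in simulate(0)])
print("delay 3  :", [round(x, 1) for x in simulate(3)])
# with no delay the state settles on the target; with delayed feedback the
# same rule overcorrects, and the excursions grow instead of dying away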


4. Brief notes on the implications for reducing error rates

Hollnagel 1993 has also reviewed the evidence that human error processes are contextual, and he gives an interesting discussion of the implications of this view for the nature of ergonomic task analysis and the process of making a Human Reliability Analysis. Many other possible approaches to reducing human error rates are suggested by the approach to error mechanisms in his book and in this paper, and by the categorisation of error types above. These approaches have been discussed elsewhere (see references below), so there is only space to give brief reminders here.


The many types of reference knowledge (2.1.3) lead to an emphasis on training. In cognitive skill, working methods are only easy to use, a temporary inference structure is only built up effectively, and reference knowledge is only structured relative to task thinking, so that it is easy to access, if the cognitive activity is regularly practised. Learning knowledge separately from using it in the task is less successful. This implies that a generous amount of both initial and refresher training is needed, not just perfunctory demonstrations. Different training methods are suited to different types of cognitive process, as indicated e.g. in Bainbridge 1989a. A proposal for a training scheme for process operators, which has the aim of integrating the different types of knowledge, is outlined in Bainbridge 1993a.


As building up and maintaining a temporary inference structure, representing the current state of the task relative to the task and cognitive requirements, is an essential part of doing a complex task, this needs to be supported by equipment and job-aids design, and by training. As the ability to build up an effective overview develops with expertise, it would be interesting to study the design of training methods which are oriented to improving these cognitive processes. Equipment interfaces also need to be designed to support maintaining an overview of the state of the task, and so that this overview is easy to build up again if it is interrupted. Modern computer based interfaces, such as those used in industrial process operation and flying, display only part of the potentially available information at any one time. It is important for such systems to be designed so that users can keep an overview of what is happening in parts of the task which are not currently displayed in detail (Bainbridge 1991a).


The temporary inference structure which is built up by someone doing a complex task successfully is not an unprocessed representation of the environment, but is the result of thinking about the task situation in relation to the task demands. So it takes time to develop. This needs to be considered in the design of, and allocation of function in, automated systems in which the operator is expected to take over operation (Bainbridge 1983, 1990).


Biases and predispositions are evidently important factors in error processes. There is some evidence that both individuals and teams can be trained to recognise the effects of bias on their behaviour, and then to allow for this in their future behaviour. This is potentially another area which would benefit from more research.


Although classical HF/E issues have not been discussed in this paper, the points made here do underline the importance of cognitive aspects in the ergonomic design of equipment, environment and job organisation. For example, the salience of information should reflect its importance in the task. And the design should require minimal extra mental work just to access, understand and use the interface, equipment and aids. One particular point is that equipment is often designed to support standard methods of working. However, this paper has emphasised that people doing complex tasks are flexible in their methods of working. This suggests the importance of designing equipment so that it can be used with a variety of working methods.


All these approaches to HF/E design and training need to be based on an appropriate task analysis, to identify the knowledge, and task and cognitive goals, which need to be supported. Hierarchical Task Analysis (e.g. Shepherd 1985) is an appropriate tool, because it focuses on describing the hierarchy of task goals which gives each activity its larger context. Roth, Woods and Pople 1992 review methods of cognitive task analysis, and describe a simulation which builds and revises a situation assessment, equivalent to the overview described in this paper. However, a major gap in current ergonomics tools is an effective method for cognitive task analysis which focuses on cognitive goals.
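
As a minimal sketch of why this hierarchical form is useful, a goal hierarchy can be represented directly as a tree, so that any activity can be traced upwards to the higher-level goals which give it its larger context. The example goals are invented for the illustration, not taken from Shepherd 1985.

from dataclasses import dataclass, field

@dataclass
class Goal:
    name: str
    subgoals: list = field(default_factory=list)

    def context_of(self, target, path=None):
        """Return the chain of goals from the top-level goal down to target."""
        path = (path or []) + [self.name]
        if self.name == target:
            return path
        for sub in self.subgoals:
            found = sub.context_of(target, path)
            if found:
                return found
        return None

plant = Goal("run plant safely", [
    Goal("maintain product quality", [Goal("adjust feed rate")]),
    Goal("respond to alarms", [Goal("identify alarm cause"),
                               Goal("select corrective action")]),
])

print(" -> ".join(plant.context_of("select corrective action")))
# run plant safely -> respond to alarms -> select corrective action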


The author would like to thank Simon Grant, Neville Moray, and the members of the Équipe Psychologie Cognitive Ergonomique, University of Paris 8, for their comments on earlier versions of this paper.




References


This list is not complete; see also the general references database.


AMALBERTI, R. 1992, Modèles d'activité en conduite de processus rapides : implications pour l'assistance à la conduite. Ph.D. Thesis, University of Paris VIII.

BAINBRIDGE, L. 1974, Analysis of verbal protocols from a process control task. In E.Edwards and F.P. Lees (eds) The Human Operator in Process Control (Taylor and Francis, London) 146-158.

BAINBRIDGE, L. 1978, Forgotten alternatives in skill and workload, Ergonomics, 21, 169-185.

BAINBRIDGE, L. 1983, Ironies of automation, Automatica, 19, 775-779.

BAINBRIDGE, L. 1984, Diagnostic skill in process operation. Proceedings of the 1984 International Conference on Occupational Ergonomics, Volume 2 : Reviews. May 7-9, Toronto, Canada, pp.1-10.

BAINBRIDGE, L. 1988, Types of representation. In L.P. Goodstein, H.B.Anderson and S.E. Olsen (eds) Tasks, Errors and Mental Models (Taylor and Francis, London) 70-91.

BAINBRIDGE, L. 1989a, Cognitive processes and training methods : a summary. In L.Bainbridge and S.A.Ruiz Quintanilla (eds) Developing Skills with Information Technology (Wiley, Chichester) 177-192.

BAINBRIDGE, L. 1989b, Types of hierarchy, types of model. In L.Bainbridge and S.J.Reinartz (eds) Proceedings of the Workshop on Cognitive Processes in Complex Tasks, TÜV Rhineland, Wilgersdorf, Germany, December, 75-85.

BAINBRIDGE, L. 1989c, Cognitive task analysis for process operations : discussion draft. Ergonomics Workgroup, University of Twente, The Netherlands.

BAINBRIDGE, L. 1989d, Development of skill, reduction of workload, In L. Bainbridge and S.A. Ruiz Quintanilla (eds) Developing Skills with Information Technology (Wiley, Chichester) 87-116.

BAINBRIDGE, L. 1990, Will expert systems solve the operators' problems ?

BAINBRIDGE, L. 1991a, Multi-plexed VDT display systems. In G.R.S.Weir and J.L.Alty (eds) Human Computer Interaction and Complex Systems (Academic Press, London) 189-210.

BAINBRIDGE, L. 1991b, Organising principles in hierarchies and models. In Proceedings of the Third European Conference on Cognitive Science Approaches to Process Control, 2-6 September, School of Psychology, University of Wales College of Cardiff, 81-90.

BAINBRIDGE, L. 1992, Mental models in cognitive skill : the case of industrial process operation. In Y.Rogers, A.Rutherford and P.Bibby (eds) Models in the Mind (Academic Press, London) 119-143.

BAINBRIDGE, L. 1993a, Planning the training of a complex skill, Le Travail Humain, 56, in press.

BAINBRIDGE, L. 1993b, Types of hierarchy imply types of model, Ergonomics, 36, 1399-1412.

BAINBRIDGE, L. 1997, The change in concepts needed to account for human behaviour in complex dynamic tasks, IEEE Transactions on Systems, Man, and Cybernetics, SMC-27, 351-359.

BARTLETT, F.C. 1943, Fatigue following highly skilled work, Proceedings of the Royal Society B, 131, 247-257.

BEISHON, R.J. 1969, An analysis and simulation of an operator's behaviour in controlling continuous baking ovens. Reprinted in E.Edwards and F.P.Lees (eds), 1974, The Human Operator in Process Control (Taylor and Francis, London) 79-90.

BISSERET, A. 1970, Mémoire opérationnelle et structure du travail, Bulletin de Psychologie, XXIV, 280-294. English summary in Ergonomics, 14, 565-570.

BOREHAM, N. 1992, Error analysis and expert/novice differences in medical diagnosis. Department of Education, University of Manchester.

BREHMER, B. 1991, Human control of systems : a view from research with microworlds. In Proceedings of the Third European Conference on Cognitive Science Approaches to Process Control, 2-6 September, School of Psychology, University of Wales College of Cardiff, 159-188.

BROWN, I. and GROEGER, J. (eds) 1990, Special Issue : Errors in the operation of transport systems, Ergonomics, 33, 1183-1429.

CELLIER, J-M., EYROLLE, H. and MARINÉ, C. 1992, Expertise in dynamic environments. Department of Psychology, University of Toulouse at Mirail.

DAVIS, D.R. 1966, Railway signals passed at danger : the drivers, circumstances and psychological processes, Ergonomics, 9, 211-222.

DECORTIS, F. 1992, Processus cognitifs de résolution d'incidents spécifiés en relation avec un modèle théorique. Ph.D. Thesis, Faculty of Psychology, University of Liège, Belgium.

DIXON, N.F. 1976, On the psychology of military incompetence. (Jonathan Cape, London).

DOERNER, D. 1980, On the problems people have in dealing with complexity, Simulation and Games, 11, 87-106.

DOERNER, D. 1987, On the difficulties people have in dealing with complexity. In J. Rasmussen, K.Duncan and J.Leplat (eds) New Technology and Human Error (Wiley, Chichester) 97-109.

DUNCAN, J. 1990, Goal weighting and the choice of behaviour in a complex world, Ergonomics, 33, 1265-1279.

GALL, W. 1990, An analysis of nuclear incidents resulting from cognitive error. Presented at the Meeting on Operating Reliability and Maintenance of Nuclear Power Plant, 8 March, Institution of Mechanical Engineers, London.

HACKER, W. 1993, Occupation psychology between basic and applied orientations : some methodological issues, Le Travail Humain, 56, (in press).

HOLLNAGEL, E., MANCINI, G. and WOODS, D.D. (eds) 1988, Cognitive Engineering in Complex Dynamic Worlds (Harcourt Brace Jovanovich, London).

HOLLNAGEL, E. 1992, Coping, coupling and control : the modelling of muddling through. In Proceedings of the Second Interdisciplinary Workshop on Mental Models, 23-25 March, Robinson College, Cambridge, 61-73.

HOLLNAGEL, E. 1993, Human Reliability Analysis : Context and Control (Academic Press, London).

KAHNEMAN, D., SLOVIC, P. and TVERSKY, A. (eds) 1982, Judgement under Uncertainty : Heuristics and Biases (Cambridge University Press, New York).

de KEYSER, V. 1991, Temporal reasoning in continuous processes : segmentation and temporal reference systems. In Proceedings of the Third European Conference on Cognitive Science Approaches to Process Control, 2-6 September, School of Psychology, University of Wales College of Cardiff, 311-334.

de KEYSER, V. and NYSSEN, A.S. 1993, Les erreurs humaines en anesthésie, Le Travail Humain, 56, (in press).

KLEIN, G.A. 1989, Recognition-primed decisions, In W.B.Rouse (ed) Advances in Man-Machine System Research, Volume 5 (JAI Press, Greenwich CT).

LEPLAT, J. 1989, Cognitive skills at work. In L.Bainbridge and S.A. Ruiz Quintanilla (eds) Developing Skills with Information Technology (Wiley, Chichester) 35-63.

LEPLAT, J. 1990, Relations between task and activity : elements of elaborating a framework for error analysis, Ergonomics, 33, 1389-1402.

MALATERRE, G. 1990, Error analysis and in-depth accident studies, Ergonomics, 33, 1403-1421.

MALONE, T.B., KIRKPATRICK, M., MALLORY, K., EIKE, D., JOHNSON, J.H. and WALKER, R.W. 1980, Human Factors Evaluation of Control Room Design and Operator Performance at Three Mile Island 2. NUREG/CR-1270 (Essex Corporation, Alexandria VA).

MORAY, N. 1990, Designing for transportation safety in the light of perception, attention and mental models, Ergonomics, 33, 1201-1213.

NORMAN, D.A. 1981, Categorisation of action slips, Psychological Review, 88, 1-15.

NORMAN, D.A. 1988, The Psychology of Everyday Things (Basic Books, New York).

NORROS, L. and SAMMATTI, P. 1986, Nuclear Power Plant Operator Errors during Simulator Training (Research Report 446, Technical Research Centre, Espoo, Finland).

PATRICK, J. 1993, Cognitive aspects of faultfinding - training and transfer, Le Travail Humain, 56, (in press).

PEW, R.W., MILLER, D.C. and FEEHER, C.E. 1981, Evaluation of Proposed Control Room Improvements Through Analysis of Critical Operator Decisions (Electric Power Research Institute, NP-1982, Research Project 891, Palo Alto, Calif.)

RASMUSSEN, J. 1982, Human errors: a taxonomy for describing human malfunction in industrial installations, Journal of Occupational Accidents, 4, 311-355.

RASMUSSEN, J. 1986, Information Processing and Human-Machine Interaction, an Approach to Cognitive Engineering (North-Holland, New York).

RASMUSSEN, J. and JENSEN, Aa. 1974, Mental procedures in real life tasks : a case-study of electronic troubleshooting, Ergonomics, 17, 293-307.

REASON, J. 1987, The psychology of mistakes : a brief review of planning failures. In J. Rasmussen, K.Duncan and J. Leplat (eds) New Technology and Human Error (Wiley, Chichester) 45-52.

REASON, J. 1990, Human Error (Cambridge University Press, Cambridge).

REASON, J. and MYCIELSKA, K. 1982, Absentminded ? The Psychology of Mental Lapses and Everyday Errors. (Prentice-Hall, Englewood Cliffs NJ).

REINARTZ, S.J. 1989, Analysis of team behaviour during simulated nuclear power plant incidents. In E.D.Megaw (ed) Contemporary Ergonomics 1989 (Taylor and Francis, London) 188-193.

REINARTZ, S.J. 1991, A multi-facetted examination of how teams cope with complex nuclear power plant incidents. Paper presented at the Third European Conference on Cognitive Science Approaches to Process Control, Cardiff, 2-6 September.

REINARTZ, S.J. and REINARTZ, G. 1989, Verbal communication in collective control of simulated nuclear power plant incidents. In Proceedings of the Second European Meeting on Cognitive Science Approaches to Process Control, 24-27 October, Siena, Italy, 195-203.

ROLT, L.T.C. - wrote on 18th and 19th century engineering. This refers to a book which includes an account of an early major railway accident at Abermule, probably Red for Danger (1955).

ROTH, E.M. and WOODS, D. 1988, Aiding human performance, I : Cognitive analysis, Le Travail Humain, 51, 39-64.

ROTH, E.M., WOODS, D.D. and POPLE, H.E. 1992, Cognitive simulation as a tool for cognitive task analysis, Ergonomics, 35, 1163-1198.

RUNMAR, K. 1989, Knowledge and skills of operators in paper mills : a comparison between experts and novices. In Proceedings of the Second European Meeting on Cognitive Science Approaches to Process Control, 24-27 October, Siena, Italy, 149-161.

SAMURÇAY, R. and ROGALSKI, J. 1988, Analysis of operator's cognitive activities in learning and using a method for decision making in public safety, In J. Patrick and K.D.Duncan (eds) Training, Human Decision Making and Control (North Holland, Amsterdam).

SHEPHERD, A. 1985, Hierarchical task analysis and training decisions, Programmed Learning and Educational Technology, 22, 162-176.

SHEPHERD, A. and HINDE, C.J. 1989, Mimicking the training expert : a basis for automating training needs analysis, In L. Bainbridge and S.A. Ruiz Quintanilla (eds) Developing Skills with Information Technology (Wiley, Chichester) 153-175.

SINCLAIR, I., SELL, R., BEISHON, R.J. and BAINBRIDGE, L. 1966, Ergonomics study of an LD Waste-Heat Boiler control room. Journal of the Iron and Steel Institute, 204, 434-442.

SPERANDIO, J.C. 1972, Charge de travail et regulation des processus operatoires, Le Travail Humain, 35, 85-98. English summary in Ergonomics, 1971, 14, 571-577.

STAGER, P. and HAMELUCK, D. 1990, Ergonomics in air traffic control, Ergonomics, 33, 493-499.

SUNDSTROM, G.A. and SALVADOR, A.C. 1991, Principles for support of joint human machine reasoning : an example from supervision and control of future multi-service networks. In Proceedings of the Third European Conference on Cognitive Science Approaches to Process Control, 2-6 September, School of Psychology, University of Wales College of Cardiff, 269-280.

VALOT, C. and AMALBERTI, R. 1991, Metaknowledge for time and reliability, Reliability Engineering and Systems Safety, 36, 199-206.






©1998, 2022 Lisanne Bainbridge



