How Do People Contribute to the Catastrophic Breakdown of Complex Automated Technologies?

As scientific knowledge progresses and technological advances are made, greater dependence is placed upon automated systems, and their complexity necessarily increases. Whilst the systems themselves may be rigorously tested to ensure they operate correctly, errors can enter the system via the weak link in the chain: the human designers and operators. Unlike the machines they operate, humans are not very good at doing the same task for a prolonged period, or at doing two things at once, and their performance becomes impaired when asked to do so (e.g. Castle & Wherewith, 1984).

Human error is therefore almost an inevitability in a complex system, and this has led to much research into the causal factors behind errors and into ways of designing systems to minimise their occurrence. Reason (1990) distinguishes between two types of error: latent errors, problems caused by poor design or implementation at a high level which may not be immediately apparent, and active errors, errors made by front-line operators which are often inherited from latent errors, although their consequences are usually seen on site and are more immediately apparent. Latent errors are the more serious category for complex automated systems, as they may not be apparent when the system is first implemented and can lie dormant until triggered by an active error (giving rise to the pathogen metaphor). As Reason observes, these errors constitute the primary residual risk to complex, highly defended technological systems.
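Reason's pathogen metaphor lends itself to a small illustrative sketch. The Python fragment below is hypothetical (the boiler, its threshold and all figures are invented for illustration): a design-time mistake lies dormant through routine operation and only becomes consequential when an operator's active error drives the system into the region the designer got wrong.

    class Boiler:
        """Hypothetical boiler whose relief threshold was mis-specified
        at design time (the latent error): 1500 kPa instead of 150 kPa."""
        RELIEF_THRESHOLD_KPA = 1500   # latent error: off by a factor of ten

        def __init__(self):
            self.pressure_kpa = 100.0

        def step(self, heat_input):
            self.pressure_kpa += heat_input
            # The defect is invisible in normal operation, because pressure
            # never approaches the (incorrect) threshold.
            if self.pressure_kpa > self.RELIEF_THRESHOLD_KPA:
                print("relief valve opens")

    boiler = Boiler()
    for _ in range(10):
        boiler.step(heat_input=5.0)    # routine operation: nothing happens

    boiler.step(heat_input=400.0)      # active error: operator overheats the boiler
    # At 550 kPa the relief valve still stays shut: the latent error only
    # now becomes consequential, long after the design decision was made.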

Errors may also be exacerbated by the increasing opacity of automated systems, and this theme is central to the issue of automated system breakdown. As automated systems become more complex, the human operators become increasingly distanced from the actual processes and lose their hands-on knowledge of the system. Such distancing and complexity of function can lead to mode errors: the operator performs the action appropriate to one mode while the system is in fact in another. For example, the pilots of an Aeromexico DC-10 made a mode error in using the autopilot, causing the engine to stall in mid-air and damaging the plane (Norman, 1983). Such systems are often safeguarded by features designed to accommodate errors without breakdown, though ironically these features often tend to exacerbate the problem.
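A mode error can be illustrated with a deliberately simplified sketch (the controller and its numbers are hypothetical, not the actual DC-10 autopilot logic): the same operator input produces radically different commands depending on an internal mode the operator may have lost track of.

    class Autopilot:
        def __init__(self):
            # The operator believes the unit is in flight-path-angle mode...
            self.mode = "VERTICAL_SPEED"   # ...but it is not.

        def dial(self, value):
            if self.mode == "FLIGHT_PATH_ANGLE":
                return f"descend at {value} degrees"
            # Same dial, same number, radically different command.
            return f"descend at {value * 1000:.0f} feet per minute"

    ap = Autopilot()
    # The pilot dials 3.3 intending a gentle 3.3-degree descent;
    # the active mode reads it as a 3300 ft/min plunge.
    print(ap.dial(3.3))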

Automated systems frequently have inbuilt defence mechanisms and can compensate for errors. Often such systems operate on the "defence in depth" principle, in which a system with a hierarchical structure of processes has a corresponding defence at each level. In such cases, an active error by the operator may be subjected to attempts at compensation at several levels, returning to the operator only as a last resort if it cannot be fully compensated for. However, this mechanism is often invisible to operators, who may be unaware that an error has even occurred until compensation is no longer possible and the system breaks down. At this point the error may have been compounded by the system's attempts to cope with the situation, and may be much larger, more complex and more obscure than when it first arose. Moreover, the system's delay in informing the operator of the error may allow the situation to pass the point at which the operator is able to save it.
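The defence-in-depth problem described above can be sketched as a chain of compensating layers (the capacities are invented for illustration): each layer silently absorbs part of a disturbance, and the operator is informed only once every layer is exhausted, by which point the residual fault is large and unfamiliar.

    def run(disturbance, layer_capacities):
        residual = disturbance
        for depth, capacity in enumerate(layer_capacities, start=1):
            residual -= min(residual, capacity)
            # No feedback reaches the operator here: the compensation
            # is invisible from the control room.
            if residual == 0:
                return f"fully compensated at layer {depth}; operator never informed"
        return f"all layers exhausted; operator alerted with residual fault of {residual}"

    print(run(disturbance=5, layer_capacities=[10, 20, 40]))    # silent success
    print(run(disturbance=90, layer_capacities=[10, 20, 40]))   # sudden, late alarm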

Such a situation is cited by Norman (1990), in which the loss of power to one aeroplane engine is compensated for by the autopilot until such compensation is no longer possible and it is too late to prevent the plane from rolling. Norman argues that in such cases automation is not the problem; rather, it is the inappropriate level of feedback given by the system. Considering a scenario similar to the one outlined above, we can envisage problems even if the system informs the operator whilst it is still possible to act. Humans have the unique ability to apply knowledge-based problem-solving routines to novel stimuli, such as those handed back by a machine faced with an error its programming cannot cope with. However, this very ability is far from optimal, especially when the individual is under stress, as would be the case if an error in the automated system could have catastrophic consequences, for example in an air traffic control system.

The operator is therefore likely to be under considerable pressure to produce a solution, and this pressure is likely to interfere with already imperfect heuristic problem-solving techniques. In attempting to match the situation against previously experienced ones (a technique known as similarity matching) and thus reuse previously successful solutions, the individual is quite likely to distort the problem space and arrive at a solution which does not fully meet the requirements of the problem. Here again, then, the strategy of using the ill-informed operator as a last resort to rectify errors which may have been made more complex by the system's own attempts to correct them is a seriously flawed and potentially catastrophic one.
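Similarity matching can likewise be caricatured in a few lines (the symptom features and past cases are invented): a nearest-match lookup always returns some previously successful solution, with no check that the best available match is actually a good fit for a genuinely novel fault.

    past_cases = {
        # (pressure_high, temp_high, pump_noise): previously successful fix
        (1, 0, 0): "open relief valve",
        (0, 1, 0): "increase coolant flow",
        (0, 0, 1): "restart pump",
    }

    def similarity(a, b):
        return sum(x == y for x, y in zip(a, b))

    def solve(novel_situation):
        best = max(past_cases, key=lambda case: similarity(case, novel_situation))
        # No check that the match is actually good: a novel combination of
        # symptoms is forced into the nearest familiar problem space.
        return past_cases[best]

    print(solve((1, 1, 1)))   # a genuinely novel fault, confidently mis-solved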

Reason (1990) has highlighted another area in which humans can cause the breakdown of automated systems: violations. Reason identifies intentionality as the feature that differentiates violations from errors. Within the violations category, routine violations are a consequence of the natural human tendency to take the path of least effort. The problem here is thus not a subconscious mistake but a decision taken by the individual in response to an indifferent environment in which such practices go unnoticed; daily safety violations were committed for a long time in the lead-up to the Chernobyl disaster. Human factors such as sloppy procedure, mismanagement and the practice of placing economic considerations above safety can all contribute to system failure. In the Bhopal tragedy, for example, the staff were insufficiently trained, an elevated pressure-gauge reading was not recognised as abnormal, and factory inspectors' warnings were ignored.

Clearly, with such large latent violations in procedure, the disaster was not entirely unpredictable (Stix, 1989). Exceptional violations are those in which the operating circumstances make them inevitable: in the Zeebrugge ferry disaster, for example, the blame was first thought to rest with the crew member who did not close the bow doors, though later evidence showed that poor time management and a lack of checking by senior staff were at fault. Again, though, the less-than-optimal performance of the people running the operation is seen to be the root cause of breakdown.

Human errors will always occur. Attention lapses, performance limitations, slips and the like are all far too unpredictable ever to be eliminated altogether, so perhaps the aim of system designers should be to minimise the effects of errors and to maximise their early detection. Automation is an area in which efficiency can be greatly improved, safety standards raised and economies made, though as Wiener & Curry (1980) observe, attempting to remove errors by automation is a flawed idea in itself, since humans will still be monitoring the systems and the errors are thus merely relocated. In conclusion, it is perhaps ironic to note that with the continued implementation of more advanced technologies, humans are increasingly assigned the role of monitors. We thus find ourselves in a situation where each half of the system is engaged in doing what the other half does best: computers are excellent at repeatedly performing mundane and tedious tasks without becoming distracted, whilst humans have very limited attention spans and become bored easily.

As Bainbridge (1983) points out, vigilance studies have shown that it is impossible to maintain effective visual attention for more than about half an hour on a source of information on which very little happens. Perhaps if more emphasis were placed on implementing technologies to monitor the performance of humans, rather than the reverse, accidents such as Chernobyl and Bhopal might be avoided.

Bibliography

Bainbridge, L. (1983) Ironies of automation. Automatica, 19(6), pp. 775-779.
Eysenck, M. W. & Keane, M. T. (1991) Cognitive Psychology. Lawrence Erlbaum.
Norman, D. A. (1983) Design rules based on analyses of human error. Communications of the ACM, 26, pp. 254-258.
Norman, D. A. (1990) The 'problem' with automation: Inappropriate feedback and interaction, not 'over-automation'. Philosophical Transactions of the Royal Society of London B, 327, pp. 585-593.
Reason, J. (1990) Human Error (ch. 7). Cambridge University Press.
Stix, G. (1989) Bhopal: A tragedy in waiting. IEEE Spectrum.
Wiener, E. L. & Curry, R. E. (1980) Flight-deck automation: Promises and problems. Ergonomics, 23(10), pp. 995-1011.

