The Human Factor: A Problem to Control or Solution to Harness?


Throughout the twentieth century, we have been in two minds about the role of humans in safety. One view is that of the human as a problem to control. Before World War II, an Oxford psychologist by the name of Vernon studied 50,000 personal injury accidents and concluded that they “depend, in the main, on carelessness and lack of attention of the workers”. “The human factor” at that time was thought to consist of characteristics specific to the individual: the “physical, mental, or moral defects” that predispose certain people to accidents. These negative characteristics, it was believed, could be identified by testing and screening people.

Psychology was dominated by behaviourism during this time, which aims to influence human behaviour so that it fits the constraints and demands of the system people work in, rather than asking why people do what they do; it has no models of the mind with which to explain that. Behavioural interventions assume that, with the right incentives and sanctions, people can be engaged in safety.

World War II changed this. Technological developments were so rapid and complex that no amount of intervention aimed at the human worker alone could solve safety problems. Practically all Army Air Force pilots, for example, regardless of experience and skill, reported errors in using cockpit controls. Instead of trying to change the human, engineers realised that they could, and should, change the technologies and tasks to make error and accident less likely. “It should be possible to eliminate a large proportion of so-called ‘pilot-error’ accidents by designing equipment in accordance with human requirements,” psychologist Paul Fitts observed at the time. Human factors were no longer seen as just the cause of trouble; humans were the recipients of trouble, which could be engineered away, to some extent. This is how we came to understand human factors in the post-war period:

“What happened was that in the last decades of the twentieth century, experts dealing with mechanisation in many settings all moved away from a focus on the careless or cursed individual who caused accidents. Instead, they now concentrated, to an extent that is remarkable, on devising technologies that would prevent damage no matter how wrongheaded the actions of an individual person, such as a worker or a driver” (Burnham, 2009, p. 5).

The psychology that was called on was very different from behaviourism. It was a psychology of cognition, a science seeking to understand the rationales behind people’s actions. To this end, models of the mind covering attention, perception, decision making and information processing were developed. Technologies and work environments could then be designed to take into account what we had learned about human performance and limitations.

Human Factors Today: Practical Experience and Expertise, but also Protocol and Bureaucracy

How do we relate to these ideas in safety engineering today? The tension between them is still visible, but we have learned, through a number of high-visibility negative events, that people are a resource to harness and that expertise and practical experience matter. In the wake of the Space Shuttle Columbia accident in 2003, for example, NASA was told it needed “to restore deference to technical experts, empower engineers to get resources they need, and allow safety concerns to be freely aired”. Similarly, prior to the Texas City refinery explosion in 2005, BP had outsourced its refining technology work, causing many experienced engineers to leave.

Deference to expertise means engaging those who are adept at recognising risks and anomalies in operational processes. These may be the workers who are in direct contact with the organisation’s safety-critical processes. While this has become a well-established prescription in research on high-reliability organisations and resilience, experts do not always get it right either. Research has identified limits on experts’ privileged knowledge of safety-critical processes and safety margins. Continued operational success, for instance, can be taken by experts as evidence that risk-free pathways have been developed, and exceptional expert competence is often associated with taking greater risk.

More Rules and Tighter Limits on the Human Factor?

Does the solution, then, lie in setting tighter limits on the human factor and seeing people as a problem to control? More rules in aviation, as safety researcher Rene Amalberti has recently observed, do not seem to contribute to more safety: “The rate of production of new rules in aviation is significantly increasing while the global aviation safety remains for years on a plateau at 10⁻⁶ (over 200 new policies/guidance/rules per year). Since nobody really knows what rules/materials are linked to the final safety level, the system is purely additive, and old rules and guidance material are never cleaned up.”
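To make the “purely additive” dynamic concrete, here is a small illustrative Python sketch (not from the article; the 200-rules-per-year figure and the 10⁻⁶ plateau come from the quote above, while the starting rule count and time horizon are arbitrary assumptions). It simply shows a rule book that grows without bound while the modelled accident rate stays flat:

# Toy illustration of Amalberti's "purely additive" rule system: rules are
# only ever added (about 200 per year, per the quote) and never cleaned up,
# while the accident rate stays on its plateau at about 10^-6.
# The starting rule count and horizon are arbitrary assumptions.

NEW_RULES_PER_YEAR = 200   # "over 200 new policies/guidance/rules per year"
PLATEAU_RATE = 1e-6        # safety "remains for years on a plateau at 10^-6"

def simulate(years, initial_rules=10_000):
    """Print the size of the rule book against the (flat) accident rate."""
    rules = initial_rules
    for year in range(years + 1):
        # More rules, same safety: the modelled rate does not improve.
        print(f"year {year:2d}: {rules:6,d} rules, accident rate ~ {PLATEAU_RATE:.0e}")
        rules += NEW_RULES_PER_YEAR  # additive only; nothing is ever removed

simulate(10)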

Increasing regulation, standardisation and systematisation paid great safety dividends during the twentieth century, and we should not let all of that go. At the same time, major accidents such as Swissair 111 and Piper Alpha have shown that it can be those who violate rules who survive such emergencies, whilst those who obey die. In an increasingly complex and fast-changing industry, it may be helpful to think about the following questions:

  • Are you asking who is responsible for the safety problem, or what is responsible? If your answer is the former, you likely see people as your problem to control. These people may, however, be the recipients of trouble deeper inside your organisation.

  • What is the balance in your organisation between intervening at the level of people’s behaviour (which might assume that tools and tasks are fixed, so people need to be “fitted” to them) and intervening in people’s working conditions and equipment (which suggests that the environment can be shaped to fit your people better)? Where capital investments have been made in equipment that might last a few more decades, there may be no choice. But your lessons can still inform the design of the next generation of equipment.

  • Are you measuring safety mostly as an absence of bad events or looking for the presence of positive capacities in your people, team and organisation? These include the capacity to question continued success (and not see it as a guarantee of future safety), the capacity to say “no” in the face of acute production pressures, and the capacity to bring in fresh perspectives on a problem and listen to the “voice from below”.

  • Are your safety policies mostly organised around limiting, constraining and controlling what your people do? Or do they actually empower them, encourage sharing and invite innovation?

  • When you see someone doing something unsafe, do you just tell him not to do it? Or do you try to understand how he rationalised his action? If his action made sense to him, it will probably make sense to others as well. That points to systemic conditions you should look at.

  • When you see a gap between how your people work and what the rule tells them to do, do you call that a “violation”? If so, you have prematurely decided who is right, and might not learn anything new. On the flip side, you could see the same gap as “resilience”, whereby your workers are finishing the design of a procedure or a piece of equipment because that design was imperfect to begin with. Your workers may well have recognised and adapted to situations that fall outside of what you have designed or trained for.

  • Are you telling your people that you will lead them into safety, making them risk averse? Or are you honest that you actually lead them into danger each day, and that you want them to be risk competent?

By Sidney Dekker
Professor, Safety Science Innovation Lab
Griffith University, Australia
This article was first published in The Leading Edge, issue 02/2014.