Ultimately, it requires strenuous effort to pinpoint when and where errors are likely to happen. The types of errors and the probability of human error occurring can be identified through careful analysis of tasks and system requirements. This yields designers and trainers information about which specific tasks and system characteristics need fortification. Despite the effort needed to implement measures against human errors, this proactive approach is valuable for the maritime industry, with its intense competition and scarce resources, considering the cost of the consequences.
Experts and novices are both prone to errors.
Experience is an essential part of expertise, and the road to becoming an expert involves developing mental schemas. These schemas help the operator by reducing the time needed to recognize situations and to make decisions and take corrective actions accordingly (Nazir et al. 2013). Experts have sophisticated ways of subconsciously knowing what to do, often characterized by experts stating that they "just know". Their schemas allow them to understand situations triggered by small, subtle cues within the environment. In contrast to experts, novices have less effective mental schemas and therefore rely on more attention and cognitive resources to perceive, understand, and predict the same situation. This difference manifests itself in the antecedents of errors committed in complicated situations: where experts can perceive subtle environmental cues to understand the situation while monitoring, novices must pay closer attention to catch the same cues. Experts, who use less attention and rely on mental patterns, can be misguided when perceiving or interpreting environmental cues, consequently making poor decisions and taking poor actions. Novices are less likely to make the same mistake, as they devote more resources to the environment and interpret the cues more consciously; however, this makes novices more prone to overload, which in turn may leave them oblivious to important environmental cues about the situation. In complex maritime operations, understanding these characteristics is of paramount importance to effectively implement measures that reduce the probability and mitigate the consequences of human errors.
Pilotage is renowned as a complicated maritime operation (Sharma and Nazir 2017). The dynamic nature of pilotage operations, i.e. that the safest option often is to keep going, creates pressure to continuously maintain situation awareness. Losing it, for instance through the mechanisms depicted above, may result in an accident. There are many examples of accidents during pilotage operations, e.g. the Godafoss, Federal Kivalina and Crete Cement accidents (Accident Investigation Board 2010a; Accident Investigation Board 2010b; Accident Investigation Board 2012). To assess human reliability in an operation, one must understand the operation itself. Thus, a depiction of a generic pilotage operation follows.
Pilotage operations can be broken down into eight main tasks: ordering and getting the pilot aboard, developing a group relationship, installing the pilot, assessing environment and weather, deciding the route, supervising navigation, coordinating tugboats, and berthing (Ernstsen et al. In Press). Developing a group relationship and assessing environment and weather are non-sequential, continuous tasks, while the other tasks are usually performed in the sequence shown in Figure 1 below.
Figure 1. Timeline of tasks in pilotage operation
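As a minimal sketch of how this task structure might be represented for analysis (the task list follows the breakdown above, but the data structure and the continuous/sequential flag are illustrative assumptions, not part of the cited task analysis):

    # Illustrative sketch of the pilotage task breakdown (Ernstsen et al. In Press).
    # The representation below is an assumption for illustration only.
    from dataclasses import dataclass

    @dataclass
    class Task:
        name: str
        continuous: bool  # True: performed throughout; False: performed in sequence

    PILOTAGE_TASKS = [
        Task("Order and get the pilot aboard", continuous=False),
        Task("Develop group relationship", continuous=True),
        Task("Install the pilot", continuous=False),
        Task("Assess environment and weather", continuous=True),
        Task("Decide route", continuous=False),
        Task("Supervise navigation", continuous=False),
        Task("Coordinate tugboats", continuous=False),
        Task("Berthing", continuous=False),
    ]

    # Separate the sequential timeline (Figure 1) from the continuous tasks.
    sequence = [t.name for t in PILOTAGE_TASKS if not t.continuous]
    continuous = [t.name for t in PILOTAGE_TASKS if t.continuous]

    print("Sequential:", " -> ".join(sequence))
    print("Continuous:", ", ".join(continuous))

Such a representation makes the distinction explicit: the two continuous tasks run in parallel with, and must be monitored alongside, the six sequential ones.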
Pilotage operations are dynamic, with many interdependent tasks. They also involve much subtle and non-transparent feedback from the system, making it more challenging and mentally demanding to perceive, assess, and understand the situation and to decide on the proper course of action. For instance, a radar with imprecise settings may pick up noise that, to an untrained eye, could be either waves or fishing vessels. Thus, operators in pilotage operations depend heavily on individual skills and knowledge of the operation, as well as on efficient collaboration, to successfully bring the vessel to berth or out of the port. This complexity creates considerable potential for human error, which emphasizes the need to understand the nature of such errors.
Human error research increased vastly after the complex accidents of the 1970s and 1980s, e.g. Three Mile Island and Chernobyl. The focus changed from technical malfunctions to acknowledging the role of human factors. After this, accident investigations began to look for errors caused by human components, whether found at the sharp or the blunt end. Error research became popular and, as a consequence, many theories were developed according to how error is conceptually applied, e.g. Rasmussen (1983); Reason (1990); Sanders and Moray (1991); Wickens et al. (2015); Woods et al. (1994).
Hollnagel (2000) proposed a novel view of error, regarding errors as contextual factors influencing (normal) performance variability, and argued that one needs to understand how these factors influence behavior in order to understand how situational changes impact performance variability (as opposed to labeling it "human error"). As mentioned, pilotage operations are complex and dynamic, with a multitude of interdependent tasks. This dictates a need to understand which environmental circumstances affect human reliability, to allow targeted training and design alterations.
Human reliability is the positive orientation of human error. Human reliability assessment (HRA) is a broad name for methods to identify and predict human errors in a system.
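Quantitative approaches often express reliability as a human error probability (HEP), i.e. the number of errors divided by the number of opportunities for error. As a minimal illustrative sketch (the task names echo the breakdown above, while the HEP values and the independence assumption are hypothetical, not taken from the cited sources):

    # Minimal sketch of a quantitative HRA-style calculation.
    # The HEP values below are hypothetical placeholders; real methods derive
    # them from data, expert judgment, and performance-shaping factors.
    hep = {
        "decide route": 0.003,
        "supervise navigation": 0.01,
        "coordinate tugboats": 0.005,
        "berthing": 0.02,
    }

    # Assuming independent tasks in series (a strong simplification),
    # overall success probability is the product of per-task successes.
    p_success = 1.0
    for task, p_error in hep.items():
        p_success *= (1.0 - p_error)

    print(f"P(operation succeeds) = {p_success:.4f}")
    print(f"P(at least one human error) = {1.0 - p_success:.4f}")

Qualitative approaches, by contrast, aim to explain how and why such errors arise rather than to quantify them.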
The increase in human error research has resulted in a large number of human reliability assessment methods, and most can be divided into quantitative or qualitative approaches to understanding and predicting human error. For instance,