139. Flach, J.M., Situation Awareness: Proceed with Caution. Human Factors: The Journal of the Human Factors and Ergonomics Society, 1995. 37(1): p. 149-157.
140. Heft, H., The Relevance of Gibson's Ecological Approach to Perception for Environment-Behavior Studies, in Toward the Integration of Theory, Methods, Research, and Utilization, G.T. Moore and R.W. Marans, Editors. 1997, Springer US: Boston, MA. p. 71-108.
141. Rasmussen, J. and K.J. Vicente, Coping with human errors through system design: implications for ecological interface design. International Journal of Man-Machine Studies, 1989. 31(5): p. 517-534.
142. Rasmussen, J., Information Processing and Human-Machine Interaction: An Approach to Cognitive Engineering. 1986: Elsevier Science Inc. 228.
143. Vicente, K.J. and J. Rasmussen, Ecological interface design: theoretical foundations. Systems, Man and Cybernetics, IEEE Transactions on, 1992. 22(4): p. 589-606.
144. Smith, G.F., Representational effects on the solving of an unstructured decision problem. IEEE Transactions on Systems, Man, and Cybernetics, 1989. 19(5): p. 1083-1090.
145. Flach, J.M., et al., Interface Design: A Control Theoretic Context for a Triadic Meaning Processing Approach, in The Cambridge Handbook of Applied Perception Research, R.R. Hoffman, et al., Editors. 2015, Cambridge University Press.
146. Bennett, K.B., Ecological interface design and system safety: One facet of Rasmussen's legacy. Applied Ergonomics, 2017. 59, Part B: p. 625-636.
147. Flach, J.M., The Ecology of Human-Machine Systems: A Personal History, in Global Perspectives on the Ecology of Human-Machine Systems, J.M. Flach, et al., Editors. 1995, Lawrence Erlbaum Associates: Hove, UK. p. 1-13.
148. Jenkins, D.P., et al., Cognitive Work Analysis: Coping with Complexity. 2008: Ashgate.
149. Jenkins, D.P.D., G.H.D. Walker, and N.A.P. Stanton, Cognitive Work Analysis. 2012, Abingdon, GB: Ashgate.
150. Read, G.J.M., et al., Designing a ticket to ride with the Cognitive Work Analysis Design Toolkit. Ergonomics, 2015: p. 1-21.
151. Naikar, N., Cognitive work analysis: An influential legacy extending beyond human factors and engineering. Applied Ergonomics, 2017. 59, Part B: p. 528-540.
152. Hilliard, A. and G.A. Jamieson, Representing energy efficiency diagnosis strategies in cognitive work analysis. Applied Ergonomics, 2017. 59, Part B: p. 602-611.
153. Stanton, N.A., et al., Cognitive Work Analysis: Applications, Extensions and Future Directions. 2017: Taylor & Francis Group.
154. Vicente, K.J., Cognitive work analysis: Toward safe, productive, and healthy computer-based work. 1999, Mahwah, NJ: Lawrence Erlbaum Associates Inc.
155. Hollnagel, E., The Diminishing Relevance of Human-Machine Interaction, in The Handbook of Human-Machine Interaction: A Human-Centered Approach, G.A. Boy, Editor. 2011, Ashgate Publishing Limited: England. p. 417-429.
156. Hobbs, A., et al., Three principles of human-system integration, in Proceedings of the 8th Australian Aviation Psychology Symposium. 2008: Sydney, Australia.
157. Behymer, K.J. and J.M. Flach, From Autonomous Systems to Sociotechnical Systems: Designing Effective Collaborations. She Ji: The Journal of Design, Economics, and Innovation, 2016. 2(2): p. 105-114.
158. Dekker, S., Drift into Failure: From Hunting Broken Components to Understanding Complex Systems. 2011, Farnham: Ashgate Publishing Co.
159. Sheridan, T.B., Humans and Automation: System Design and Research Issues. 2002, New York: John Wiley. 280.
160. Sheridan, T.B., Telerobotics, automation and human supervisory control. 1992, Cambridge: MIT Press.
161. Bainbridge, L., Ironies of automation. Automatica, 1983. 19(6): p. 775-779.
162. Woods, D.D., Decomposing Automation: Apparent Simplicity, Real Complexity, in Automation and Human Performance: Theory and Applications, R. Parasuraman and M. Mouloua, Editors. 1996: Erlbaum.
163. Norman, D.A., The problem of automation: Inappropriate feedback and interaction, not over-automation, in Human factors in hazardous situations, D.E. Broadbent, A. Baddeley, and J.T. Reason, Editors. 1990, Oxford University Press. p. 585-593.
164. Parasuraman, R. and D.H. Manzey, Complacency and bias in human use of automation: an attentional integration. Human Factors, 2010. 52(3): p. 381-410.
165. Parasuraman, R. and V. Riley, Humans and Automation: Use, Misuse, Disuse, Abuse. Human Factors: The Journal of the Human Factors and Ergonomics Society, 1997. 39(2): p. 230-253.
166. Sauer, J., A. Chavaillaz, and D. Wastell, Experience of automation failures in training: effects on trust, automation bias, complacency, and performance. Ergonomics, 2015: p. 1-28.
167. Chavaillaz, A., D. Wastell, and J. Sauer, System reliability, performance and trust in adaptable automation. Applied Ergonomics, 2016. 52: p. 333-342.
168. Mosier, K.L., et al., Aircrews and Automation Bias: The Advantages of Teamwork? The International Journal of Aviation Psychology, 2001. 11(1): p. 1-14.
169. Skitka, L., K.L. Mosier, and M. Burdick, Accountability and automation bias. International Journal of Human-Computer Studies, 2000. 52(4): p. 701-717.
170. Skitka, L., K.L. Mosier, and M. Burdick, Does automation bias decision-making? International Journal of Human-Computer Studies, 1999. 51(5): p. 991-1006.
171. Mosier, K.L., et al., Automation bias: decision making and performance in high-tech cockpits. The International Journal of Aviation Psychology, 1998. 8(1): p. 47-63.
172. Lee, J.D. and K.A. See, Trust in Automation: Designing for Appropriate Reliance. Human Factors, 2004. 46(1): p. 50-80.
173. Bradshaw, J.M., et al., The Seven Deadly Myths of Autonomous Systems. IEEE Intelligent Systems, 2013. 28(3): p. 54-61.
174. Wickens, C.D., Automation Stages & Levels, 20 Years After. Journal of Cognitive Engineering and Decision Making, 2017: p. 1555343417727438.
175. Riley, V., A General Model of Mixed-Initiative Human-Machine Systems. Proceedings of the Human Factors Society Annual Meeting, 1989. 33(2): p. 124-128.
176. Parasuraman, R., T.B. Sheridan, and C.D. Wickens, A model for types and levels of human interaction with automation. Systems, Man and Cybernetics, Part A: Systems and Humans, IEEE Transactions on, 2000. 30(3): p. 286-297.
177. Endsley, M.R. and D.B. Kaber, Level of automation effects on performance, situation awareness and workload in a dynamic control task. Ergonomics, 1999. 42(3): p. 462-492.
178. Woods, D.D., The Risks of Autonomy. Journal of Cognitive Engineering and Decision Making, 2016. 10(2): p. 131-133.
179. Klein, G., et al., Ten challenges for making automation a "team player" in joint human-agent activity. IEEE Intelligent Systems, 2004. 19(6): p. 91-95.
180. MAIB, Annual report 1999. 2000, Department of the Environment, Transport and the Regions: London.
181. Ventikos, N.P., G.V. Lykos, and I.I. Padouva, How to achieve an effective behavioral-based safety plan: the