Attention grows as satisfaction with the scenario increases, reflected in the gaze-tracking parameter: its mean rises from high → very high satisfaction (a gaze vertical parameter > 1 indicates that the user is viewing the screen).
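As an illustration only, the following sketch shows how such a comparison could be reproduced; the record fields and sample values are hypothetical, not data from the study:

```python
from statistics import mean

# Hypothetical gaze records; gaze_vertical > 1 is read here as "user is viewing the screen".
records = [
    {"satisfaction": "high", "gaze_vertical": 1.08},
    {"satisfaction": "high", "gaze_vertical": 1.12},
    {"satisfaction": "very high", "gaze_vertical": 1.25},
    {"satisfaction": "very high", "gaze_vertical": 1.31},
]

def mean_gaze_by_satisfaction(records):
    """Group gaze-vertical values by satisfaction level and return the mean per group."""
    groups = {}
    for r in records:
        groups.setdefault(r["satisfaction"], []).append(r["gaze_vertical"])
    return {level: mean(values) for level, values in groups.items()}

print(mean_gaze_by_satisfaction(records))
# The pattern reported above corresponds to the 'very high' mean exceeding the 'high' mean.
```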
The Engine Room Simulator training increases job security and performance, has professional value, and contributes to professional development (motivation model).
In the lexical analysis, we observe that the total number of words in the users' answers depends on satisfaction (the mean of total words grows from high → very high satisfaction) and that IndexWord_nonSatisf < IndexWord_Satisf (from high → very high satisfaction).
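A minimal sketch of this kind of lexical comparison, assuming simple illustrative word lists (the study's actual lexicons and index definition are not reproduced here):

```python
# Illustrative word lists; not the sentiment lexicons used in the study.
SATISF_WORDS = {"good", "easy", "useful", "helpful", "satisfied"}
NONSATISF_WORDS = {"difficult", "confusing", "slow", "unsatisfied"}

def lexical_indices(answer):
    """Return the total word count and simple satisfaction / non-satisfaction word indices."""
    words = [w.strip(".,;!?").lower() for w in answer.split()]
    total = len(words)
    satisf = sum(w in SATISF_WORDS for w in words)
    nonsatisf = sum(w in NONSATISF_WORDS for w in words)
    # Indices are expressed as shares of the total, so answers of different length are comparable.
    return {
        "total_words": total,
        "index_satisf": satisf / total if total else 0.0,
        "index_nonsatisf": nonsatisf / total if total else 0.0,
    }

print(lexical_indices("The simulator was easy to use and very useful for engine room training."))
```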
The SUS score shows a satisfactory rating, and the score grows from high → very high satisfaction, indicating high usability (easy to use, easy to learn).
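For reference, the standard SUS scoring rule from Brooke (1996) maps the ten 1-5 item responses onto a 0-100 scale; a direct implementation of that rule (the sample responses are illustrative):

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response);
    the sum of contributions is scaled by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i is an odd-numbered item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Example: a fairly positive questionnaire scores above the commonly cited ~68 average.
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]))  # 80.0
```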
Finally, the Total Satisfaction Index (TSI) is high in the sample (mean TSI: 1.3, a characterization of ‘high’).
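The TSI formula itself is not restated in this section; purely as a sketch, assuming the index is an average of per-user satisfaction ratings mapped onto qualitative bands, the characterization could be computed as follows (thresholds are illustrative):

```python
from statistics import mean

def total_satisfaction_index(ratings):
    """Average the per-user satisfaction ratings into a single TSI value.

    Assumption: a mean around 1.3 falls in the 'high' band, consistent with the text above.
    """
    return mean(ratings)

def characterize_tsi(tsi, high=1.0, very_high=1.5):
    """Map a TSI value to a qualitative label; the thresholds are illustrative only."""
    if tsi >= very_high:
        return "very high"
    if tsi >= high:
        return "high"
    return "moderate or lower"

tsi = total_satisfaction_index([1.2, 1.4, 1.3])
print(round(tsi, 2), characterize_tsi(tsi))  # 1.3 high
```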
The connection between all the above elements resulted from the processing of the optical registration data and the users' interviews and questionnaires.
The approach is general in the sense that it can be applied to various types of e-learning marine systems. It is also pluralistic in the sense that it provides the evaluator with complementary sources of data that can reveal important aspects of the user experience during ship control. Certainly, the proposed approach may require further adaptations to accommodate the evaluation of particular interactive systems.
REFERENCES
Asteriadis, S., Tzouveli, P., Karpouzis, K., Kollias, S. 2009. Estimation of behavioral user state based on eye gaze and head pose—application in an e-learning environment, Multimedia Tools and Applications, Springer, Volume 41, Number 3, February, pp. 469-493.
Brooke, J. 1996. SUS: A “quick and dirty” usability scale. In: Jordan, P. W., Thomas, B., Weerdmeester, B. A., McClelland, I. L. (eds.), Usability Evaluation in Industry, Taylor & Francis, London, UK, pp. 189-194.
Cheng, D., Zhao, Z., Lu, J., Tu, D. 2010. A Kind of Modelling and Simulating Method for Eye Gaze Tracking HCI System, Proceedings of the 3rd International Congress on Image and Signal Processing (CISP 2010), IEEE, EMB, pp. 511-514.
Cournia, N., Smith, J. D., Duchowski, A. T. 2003. Gaze- vs. hand-based pointing in virtual environments, in: CHI '03 Extended Abstracts on Human Factors in Computing Systems, ACM Press, Ft. Lauderdale, Florida, USA, pp. 772-773.
Dix, A., Finlay, J., Abowd, G. D., Beale, R. 2004. Human-Computer Interaction, UK: Pearson Education Limited.
Duchowski, A. T. 2007. Eye tracking methodology: Theory and practice, NY: Springer.
Fontaine, J. R., Scherer, K. R., Roesch, E. B., Ellsworth, P. C. 2007. The world of emotions is not two-dimensional, Psychological Science, 18(2), pp. 1050-1057.
Fotopoulou, A., Mini, M., Pantazara, M., Moustaki, A. 2009. “La combinatoire lexicale des noms de sentiments en grec moderne”, in Le lexique des émotions, I. Novakova and A. Tutin, Eds., Grenoble: ELLUG.
Galin, D., Ornstein, R. 1974. Individual Differences in Cognitive Style—I. Reflective Eye Movements, Neuropsychologia, vol. 12, pp. 367-376.
Hannus, M., Hyona, J. 1999. Utilization of illustrations during learning of science textbook passages among low- and high-ability children, Contemporary Educational Psychology, 24, pp. 95-123.
Hansen, D. W., Ji, Q. 2010. In the Eye of the Beholder: A Survey of Models for Eyes and Gaze, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 32, Is. 3, pp. 478-500.
Hegarty, M., Just, M. A. 1993. Constructing mental models of machines from text and diagrams, Journal of Memory and Language, 32, pp. 717-742.
Holsanova, J., Holmberg, N., Holmqvist, K. 2009. Reading information graphics: the role of spatial contiguity and dual attentional guidance, Applied Cognitive Psychology, 23, pp. 1215-26.
Hyona, J., Niemi, P. 1990. Eye movements during repeated reading of a text, Acta Psychologica, 73, pp. 259-80.
Jacob, R. 1990. What you look at is what you get: eye movement-based interaction techniques, in: CHI '90: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM, Seattle, Washington, United States, pp. 11-18.
Just, M. A., Carpenter, P. A. 1980. A theory of reading: From eye fixations to comprehension, Psychological Review, 87, pp. 329-55.
IMO (International Maritime Organization), 2003. Issues for training seafarers resulting from the implementation on board technology, STW 34/INF.6.
Isokoski, P., Martin, P. B. 2006. Eye Tracker Input in First Person Shooter Games, in: Proceedings of the 2nd Conference on Communication by Gaze Interaction (COGAIN 2006): Gazing into the Future, Turin, Italy, pp. 78-81.
Istance, H., Vickers, S., Hyrskykari, A. 2009. Gaze-based interaction with massively multiplayer on-line games, in: Proceedings of the 27th International Conference Extended Abstracts on Human Factors in Computing Systems, ACM, Boston, MA, USA, pp. 4381-4386.
Kotzabasis, P. 2011. Human-Computer Interaction: Principles, methods and examples, Athens: Kleidarithmos (in Greek).
Lambov, D., Pais, S., Dias, G. 2011. Merged Agreement Algorithms for Domain Independent Sentiment Analysis, Pacific Association for Computational Linguistics (PACLING 2011), Procedia - Social and Behavioral Sciences, 27, pp. 248-257.
Lazarus, R. S. 1982. Thoughts on the Relation between Emotion and Cognition, American Psychologist, 24, pp. 210-222.
Malatesta, L. 2009. “Human-Computer Interaction based on analysis and synthesis of optical data”, PhD Thesis, NTUA, Athens (in Greek).
Maughan, L., Gutnikov, S., Stevens, R. 2007. Like more, look more, look more, like more: The evidence from eye-tracking, The Journal of Brand Management, 14, pp. 335-342, doi:10.1057/palgrave.bm.2550074.
Mueller, S. C., Jackson, C. P. T., Skelton, R. W. 2008. Sex Differences in a Virtual Water Maze: An Eye Tracking and Pupillometry Study, Behavioural Brain Research, vol. 193, pp. 209-215.
Nacke, L. E., Stellmach, S., Sasse, D., Niesenhaus, J., Dachselt, R. 2011. LAIF: A logging and interaction framework for gaze-based interfaces in virtual entertainment environments, Entertainment Computing, 2, pp. 265-273.
Pinker, S., Jackendoff, R. 2005. The faculty of language: what’s special about it?, Cognition, 95, pp. 201-236.
Papachristos, D., Alafodimos, K., Nikitakos, N. 2012. Emotion Evaluation of Simulation Systems in Educational Practice, Proceedings of the International Conference on E-Learning in the Workplace (ICELW 12), 13-15 June, NY: Kaleidoscope Learning, www.icelw.org.
Rayner, K. 1998. Eye movements in reading and information processing: 20 years of research, Psychological Bulletin, 124, pp. 372-422.