Motion Recognition and Students’ Achievement

Wen-Fu Pan, Anton Subarno, Mei-Ying Chien, Ching-Dar Lin


Human motion carries multifarious meanings that machines can recognize, for example through facial detection. This article explores body motion recognition to explain the relationship between students’ motions and their achievement, as well as teachers’ responses to students’ motions, especially negative ones. Students’ motions can be identified according to three categories: facial expression, hand gestures, and body position and movement.

Facial expression covers four categories: contempt, fear, happiness, and sadness. Contempt expresses conflicted feelings, fear expresses unpleasantness, happiness expresses satisfaction, and sadness signals that the environment is uncomfortable.

Hand gestures can likewise be grouped into four categories: conversational, controlling, manipulative, and communicative. Conversational gestures accompany and support speech. Controlling gestures drive vision-based interfaces, such as those popular in current technology. Manipulative gestures are used in human interaction with virtual objects. Communicative gestures relate to human interaction and therefore involve the field of psychology.

Body position and movement can also be classified into four categories: leaning forward, leaning backward, correct posture, and physical relocation. Leaning forward occurs when a user is working with a high level of concentration. Leaning backward occurs when a user has concentrated on work for several hours and needs a break or a change. Correct posture is the sign of an enjoyable working position, with the user sitting in a free and relaxed manner. Physical relocation refers to a change in the student’s sitting location, reflecting some inadequacy of the learning environment.
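The taxonomy above can be sketched as a simple lookup structure. This is a minimal illustration, not an implementation from the article: all identifiers and cue names are hypothetical, and the interpretations are paraphrased from the categories described above.

```python
# Illustrative sketch (not from the article): the three motion categories
# and their four subcategories, mapped to the interpretations given above.
MOTION_TAXONOMY = {
    "facial_expression": {
        "contempt": "conflicted feelings",
        "fear": "unpleasantness",
        "happiness": "satisfaction",
        "sadness": "uncomfortable environment",
    },
    "hand_gesture": {
        "conversational": "accompanies speech",
        "controlling": "vision-based interface command",
        "manipulative": "interaction with virtual objects",
        "communicative": "human interaction (psychology)",
    },
    "body_position": {
        "leaning_forward": "high concentration",
        "leaning_backward": "needs a break or change",
        "correct_posture": "relaxed, enjoyable working position",
        "relocation": "inadequate learning environment",
    },
}


def interpret(cue):
    """Return (category, interpretation) for an observed cue, or None."""
    for category, cues in MOTION_TAXONOMY.items():
        if cue in cues:
            return category, cues[cue]
    return None


print(interpret("leaning_forward"))
# -> ('body_position', 'high concentration')
```

In practice a recognizer would first classify raw sensor or camera data into one of these cue labels; the lookup then attaches the pedagogical interpretation a teacher could act on.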

Teachers can anticipate changes in students’ emotions through good learning design, teaching metacognitive skills, self-regulated performance, exploratory talk, a mastery approach/avoidance framework, hybrid learning environments, and control of space within classrooms. Teachers’ responses to students’ motions will be explored in this article.


Keywords: motion recognition, facial expression, hand gestures, body position


SISFORMA: Journal of Information Systems | p-ISSN: 2355-8253 | e-ISSN: 2442-7888

This work is licensed under a Creative Commons Attribution 4.0 International License.