
The blenderFace method: video-based measurement of raw movement data during facial expressions of emotion using open-source software

Behavior Research Methods, Vol. 51, No. 2, Springer Nature, 2018, pp. 747–768

Year of publication: 2018

Publication type: Journal article

Language: English

DOI/URN: 10.3758/s13428-018-1085-9

Full text via DOI/URN


Abstract


This article proposes an optical measurement of movement applied to data from video recordings of facial expressions of emotion. The approach offers a way to capture motion adapted from the film industry in which markers placed on the skin of the face can be tracked with a pattern-matching algorithm. The method records and postprocesses raw facial movement data (coordinates per frame) of distinctly placed markers and is intended for use in facial expression research (e.g., microexpressions) in laboratory settings. Due to the explicit use of specifically placed, artificial markers, the procedure offers the simultaneous measurement of several emotionally relevant markers in a (psychometrically) objective and artifact-free way, even for facial regions without natural landmarks (e.g., the cheeks). In addition, the proposed procedure is fully based on open-source software and is transparent at every step of data processing. Two worked examples demonstrate the practicability of the proposed procedure: In Study 1 (N = 39), the participants were instructed to show the emotions happiness, sadness, disgust, and anger, and in Study 2 (N = 113), they were asked to present both a neutral face and the emotions happiness, disgust, and fear. Study 2 involved the simultaneous tracking of 14 markers for approximately 12 min per participant with a time resolution of 33 ms. The measured facial movements corresponded closely to the assumptions of established measurement instruments (EMFACS, FACSAID, Friesen & Ekman, 1983; Ekman & Hager, 2002). In addition, the measurement was found to be very precise with sub-second, sub-pixel, and sub-millimeter accuracy.
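The abstract describes the core data flow of the method: markers on the face are tracked frame by frame, and the resulting raw coordinates are postprocessed into movement measures. As a rough illustration only, the following Python sketch shows one way such per-frame coordinates could be reduced to displacement curves. The file name, column layout, baseline choice, and pixel-to-millimeter factor are assumptions made for this example; they are not the actual interface of the blenderFace software described in the article.

    # Minimal sketch (not the blenderFace software itself): postprocessing raw
    # per-frame marker coordinates exported from a tracking step, assuming a
    # hypothetical CSV with columns frame, marker, x, y (pixel coordinates).
    import csv
    from collections import defaultdict

    def load_tracks(path):
        """Read per-frame (x, y) pixel coordinates for each marker."""
        tracks = defaultdict(list)  # marker -> list of (frame, x, y)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                tracks[row["marker"]].append(
                    (int(row["frame"]), float(row["x"]), float(row["y"]))
                )
        for points in tracks.values():
            points.sort()  # order each marker's samples by frame number
        return tracks

    def displacement_mm(tracks, mm_per_pixel=0.25):
        """Per-frame Euclidean displacement of each marker from its position in
        the first tracked frame (assumed neutral face), scaled from pixels to
        millimeters. mm_per_pixel is an assumed calibration factor."""
        out = {}
        for marker, points in tracks.items():
            _, base_x, base_y = points[0]
            out[marker] = [
                (f, ((x - base_x) ** 2 + (y - base_y) ** 2) ** 0.5 * mm_per_pixel)
                for f, x, y in points
            ]
        return out

    if __name__ == "__main__":
        tracks = load_tracks("tracked_markers.csv")  # hypothetical export file
        for marker, series in displacement_mm(tracks).items():
            peak_frame, peak_mm = max(series, key=lambda t: t[1])
            print(f"{marker}: peak displacement {peak_mm:.2f} mm at frame {peak_frame}")

With a 33 ms time resolution, as reported for Study 2, each frame in such an export corresponds to roughly one video frame at 30 frames per second, so displacement curves like these can be aligned directly with the stimulus timeline.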

Classification


DFG subject area:
Psychology

DDC subject group:
Psychology

Related research projects


Linked persons