
# Natural dynamic backgrounds affect perceived facial dominance


We have tested whether natural dynamic backgrounds affect perceived facial dominance. Facial evaluation is based on just two fundamental dimensions of facial appearance: valence and dominance. Perceived facial valence has been shown to be biased towards background valence. However, it is currently unknown whether the perception of facial dominance is also context dependent. In this study, participants rated the perceived dominance of neutral faces superimposed on everyday dynamic backgrounds that were classified as either low (weak) or high (strong) in dominance. Neutral faces were perceived as significantly more dominant when seen on a strong dynamic background than on either a weak or a neutral background. Thus, background dominance enhances perceived facial dominance. Since dynamic textures are ubiquitous, this finding is relevant for the design and experience of both our daily environment and multimedia content.

Figure 1.

(A) The 12 neutral faces from the Princeton faces database[1] overlaid on dynamic textures from the Dyntex database[2] that are classified as either strong (upper 2 rows) or weak (lower 2 rows)[3].

(B) Median dominance ratings for neutral faces shown on the dynamic textures from the Dyntex database[2] that are classified as either strong (upper row) or weak (lower row)[3]. In this figure, the background images are shown in lighter shades than those actually used in the experiment.

(C) Tukey boxplot of the median dominance ratings for neutral faces shown on respectively the strong (upper 6; top-down corresponding to left-right in Fig. 1B) and weak (lower 6) backgrounds. Thick line denotes the median; stars denote outliers (values more than 1.5 IQR below the first quartile or above the third quartile).
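Tukey's outlier rule used in the boxplot (flagging values more than 1.5 IQR beyond the quartiles) can be sketched as follows. This is an illustrative snippet with made-up ratings, not the study's data:

```python
import numpy as np

def tukey_outliers(ratings):
    """Flag values more than 1.5 * IQR below Q1 or above Q3 (Tukey's rule)."""
    q1, q3 = np.percentile(ratings, [25, 75])
    iqr = q3 - q1
    lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [r for r in ratings if r < lower or r > upper]

# Hypothetical median dominance ratings; only 0.9 lies outside the fences.
print(tukey_outliers([0.1, 0.2, 0.25, 0.3, 0.35, 0.9]))  # → [0.9]
```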

People are typically unaware that the context in which they see a face influences their affective judgement, even when this context has no relevance for their assessment[4][5][6][7][8]. Associations between context and faces are routinely established and modulate face perception already at the early stages of facial feature processing, such that the affective quality of the context transfers to the perceived affective state of a face[4][8]. As a result, identical facial configurations may convey strikingly different emotions and dimensional values depending on the context in which they are perceived[5]. Previous studies have shown that perceived facial valence is biased towards background valence: faces perceived in negative (or positive) contexts are judged to be more negative (or positive) than in a neutral context[5][6]. However, it is currently unknown whether the perception of facial dominance is also context dependent. Environmental psychology has shown that dominance is one of the principal affective qualities of backgrounds and environments. The dominance (sometimes also called strength or potency) of a background is defined as the degree to which it affects the observer[9][10]. An observer may feel overwhelmed by, and unable to control, a strong background, while (s)he may feel in control of, and able to influence, a weak background. We hypothesised that background dominance may also bias perceived facial dominance. To test this hypothesis, we performed an experiment in which observers judged the dominance of neutral faces presented on natural dynamic textures (spatially repetitive, time-varying visual patterns that repeat, or seem to repeat, themselves over time[11]) that were classified as neutral, low (weak) or high (strong) in dominance (Fig. 1A).

The objective of this study was to investigate whether perceived facial dominance is biased by the dominance of everyday dynamic backgrounds.

The results of the experiment were first aggregated by calculating the median dominance rating per background across all 12 faces for each of the 30 participants (Fig. 1B and C). Next, these median dominance ratings were aggregated across the 6 strong backgrounds and across the 6 weak backgrounds separately (again using the median). These ratings were then compared with each other and with neutrality (i.e. zero). A Friedman test revealed a significant effect of background on dominance rating ($\chi^2$ = 13.972, p = 0.001). A post-hoc analysis was then performed using multiple Wilcoxon signed-rank tests, conducted with a Bonferroni-adjusted alpha level of 0.0167 per test (0.05/3). The results show a significant difference between perceived dominance ratings of faces on strong and weak backgrounds (Z = 3.033, p = 0.002). Perceived dominance ratings for faces on strong backgrounds also differ significantly from neutral (Z = 3.305, p = 0.001). However, perceived dominance ratings for faces on weak backgrounds do not differ significantly from neutral (Z = 2.230, p = 0.026). Thus, neutral faces are perceived as significantly more dominant when seen against a strong background than against either a weak or a neutral (dark) background. A Mann-Whitney U test was performed to test for gender differences. For neutral faces shown on strong backgrounds this test revealed no significant difference between dominance ratings by males (Md = 0.32, N = 18) and females (Md = 0.23, N = 12), U = 86.5, Z = -0.911, p = 0.368, r = -0.16. For neutral faces shown on weak backgrounds there was also no significant difference between males (Md = 0.08, N = 18) and females (Md = 0.12, N = 12), U = 92.0, Z = -0.678, p = 0.518, r = -0.12.
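The nonparametric pipeline described above (a Friedman omnibus test, Bonferroni-corrected Wilcoxon signed-rank post-hocs, and a Mann-Whitney U test for gender) could be sketched with SciPy roughly as follows. The data here are synthetic placeholders, not the study's ratings:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic per-participant median dominance ratings (30 participants),
# purely illustrative -- NOT the study's data. Ratings are recoded so
# that 0 = neutral, positive = dominant, negative = submissive.
strong = rng.normal(0.3, 0.15, 30)   # medians over the 6 strong backgrounds
weak = rng.normal(0.1, 0.15, 30)     # medians over the 6 weak backgrounds
neutral = np.zeros(30)               # neutral reference point

# Omnibus test for an effect of background across the related samples.
chi2, p_omnibus = stats.friedmanchisquare(strong, weak, neutral)
print(f"Friedman: chi2 = {chi2:.3f}, p = {p_omnibus:.4f}")

# Post-hoc pairwise Wilcoxon signed-rank tests with a Bonferroni-adjusted
# alpha of 0.05 / 3 ~= 0.0167 per comparison.
alpha = 0.05 / 3
for label, (a, b) in [("strong vs weak", (strong, weak)),
                      ("strong vs neutral", (strong, neutral)),
                      ("weak vs neutral", (weak, neutral))]:
    w, p = stats.wilcoxon(a, b)
    print(f"{label}: W = {w:.1f}, p = {p:.4f}, significant = {p < alpha}")

# Gender comparison on independent samples: Mann-Whitney U test.
males = rng.normal(0.3, 0.15, 18)
females = rng.normal(0.25, 0.15, 12)
u, p_gender = stats.mannwhitneyu(males, females, alternative="two-sided")
print(f"gender: U = {u:.1f}, p = {p_gender:.4f}")
```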

Our current finding that even everyday dynamic textures such as streaming water, swirling leaves, moving clouds, waving flags or traffic streams can influence perceived facial dominance agrees with the growing body of evidence that background context modulates perceived facial emotions[5][6][7]. The current results also agree with recent brain studies showing that contextual information influences activity in the extended neural network of face processing and thus alters the perception and evaluation of facial expressions[12][13][8][14]. In particular, it has recently been observed that the amygdala integrates facial expression with salient motion information[15]. This indicates that the amygdala is not only responsive to facial expressions[16] but also incorporates the overall perceptual context of a stimulus. Hence, it appears that facial evaluation is context dependent and not automatic, hard-wired, effortless and universal as previously proposed[17].

In addition to their high prevalence in our daily environment, dynamic textures are also increasingly applied in animation[18] and video synthesis[11], and are deployed on large-scale digital billboards and electronic wallpapers[19]. Because of this ubiquity of dynamic textures, our current findings may be relevant for the design and experience of both our daily environment and multimedia content.

Everyday dynamic background textures such as streaming water, swirling leaves, moving clouds, waving flags or traffic streams can bias perceived facial dominance.

A limitation of this study is the small number of stimuli (faces and backgrounds) that were tested. Future studies should use a larger number of dynamic background textures with widely varying content and motion patterns, covering the entire dominance range, to allow a closer investigation (correlation) of the relation between background dominance and perceived facial dominance. In addition, artificial affective motion textures with well specified path curvature, speed and texture layout[18] may serve to systematically investigate the relation between different spatio-temporal texture parameters and perceived facial dominance.

We expect that visual background dominance may also bias the affective appraisal of non-face objects with no evident semantic affective connotation. Moreover, it is also likely that this effect may carry over to other sensory modalities[20]. Hence, a dominant visual background may bias the perception of certain smells, tastes or sounds. We plan to investigate these issues in a follow-up study.

Thirty observers (12 female; ages 21–64 years, mean age 40 years) rated the dominance of 12 neutral faces overlaid on 12 different dynamic backgrounds, using a 5-point rating scale ranging from very submissive to very dominant. Six of these backgrounds had previously been classified as high in dominance (strong backgrounds: mean dominance rating 0.85±0.07 on a scale from 0 to 1, N = 35[3]) while the 6 others had been classified as low in dominance (weak backgrounds: mean dominance rating 0.22±0.06, N = 35[3]).

The 12 neutral faces were selected from the Dominance data set of the validated Princeton faces database[1]. Their neutrality had been verified in a previous study for a homogeneous dark background[22]. Neutral faces were used as targets since their evaluation is affected by emotional scene content to a greater extent than the evaluation of faces with exaggerated facial expressions[5][6][8], probably because of their ambiguous nature[23][24]. In addition, by using neutral faces, issues of stimulus-background congruency can be avoided[5].

The 12 dynamic backgrounds were different natural textures from the Dyntex database (AVI movies with a resolution of 600×480 pixels, a duration of 10 s, and a frame rate of 25 fps[2]), representing everyday background scenes like moving water, fluttering vegetation and a waving flag. In an earlier study[3] 6 of these textures (with identifiers 54ab110, 64adl10, 648dc10, 649ha10, 6484d10, and 6485110 in the Dyntex database) were classified as high in dominance (strong), and 6 others (with identifiers 54ac110, 571b110, 645ab10, 6486b10, 6482210, 6485310) were classified as low in dominance (weak; Fig. 1B).

In contrast to the previous studies on the effects of affective backgrounds on facial evaluation[5][6] the backgrounds used in this study are dynamic and have no evident semantic affective connotation. Each face was overlaid on each dynamic background, resulting in a total of 144 different stimuli (12 faces×12 backgrounds).

Dell Precision 490 PCs were used to present the stimuli to the observers in random order and to register their response. The computers were equipped with Dell 19" monitors, with a screen resolution of 1280×1024 pixels, and a screen refresh rate of 60 Hz. MediaLab v2012 (www.empirisoft.com) was used to present the stimuli and collect the answers. The stimuli were presented for maximally 10 s on a light grey background, flanked by a rating scale. If a participant responded within 10 s after the onset of a stimulus presentation, the current face would disappear and the next face would be shown. If a participant needed more than 10 s to respond, the stimulus disappeared from the screen but the rating scale remained visible until the participant had responded. Participants were instructed to base their answers solely on their first and overall impression of each face and to ignore the background. Observers used standard mouse pointers to indicate their response. Statistical analyses were performed with IBM SPSS 20.0 for Windows. As the experiments used an ordinal scale of measurement, without assuming it is also an interval scale, non-parametric tests were used.

The authors declare no conflicts of interest.

The participants read and signed an informed consent prior to the experiment. The experimental protocol was reviewed and approved by the TNO Ethics Committee and was in accordance with the Helsinki Declaration of 1975, as revised in 2013[21].

No fraudulence is committed in performing these experiments or during processing of the data. We understand that in the case of fraudulence, the study can be retracted by ScienceMatters.

1. Nikolaas N. Oosterhof, Alexander Todorov
The functional basis of face evaluation
Proceedings of the National Academy of Sciences, 105/2008, pages 11087-11092. DOI: 10.1073/pnas.0805664105
2. Renaud Péteri, Sándor Fazekas, Mark J. Huiskes
DynTex: A comprehensive database of dynamic textures
Pattern Recognition Letters, 31/2010, pages 1627-1632. DOI: 10.1016/j.patrec.2010.05.009
3. Alexander Toet, Menno Henselmans, Marcel P. Lucassen, Theo Gevers
Emotional Effects of Dynamic Textures
i-Perception, 2/2011, pages 969-991. DOI: 10.1068/i0477
4. Lisa Feldman Barrett, Batja Mesquita, Maria Gendron
Context in Emotion Perception
Current Directions in Psychological Science, 20/2011, pages 286-290. DOI: 10.1177/0963721411422522
5. Shahnaz Koji, Myra Fernandes
Does it matter where we meet? The role of emotional context in evaluative first impressions.
Canadian Journal of Experimental Psychology/Revue canadienne de psychologie expérimentale, 64/2010, pages 107-116. DOI: 10.1037/a0019139
6. Tae-Ho Lee, June-Seek Choi, Yang Seok Cho
Context Modulation of Facial Emotion Perception Differed by Individual Difference
7. Ruthger Righart, Beatrice de Gelder
Recognition of facial expressions is influenced by emotional scene gist
Cognitive, Affective, & Behavioral Neuroscience, 8/2008, pages 264-272. DOI: 10.3758/cabn.8.3.264
8. Matthias J. Wieser, Tobias Brosch
Faces in Context: A Review and Systematization of Contextual Influences on Affective Face Processing
Frontiers in Psychology, 3/2012, page 471. DOI: 10.3389/fpsyg.2012.00471
9. Albert Mehrabian
Basic Dimensions for a General Psychological Theory: Implications for Personality, Social, Environmental, and Developmental Studies
Cambridge, MA: Oelgeschlager, Gunn & Hain, 1980
10. Albert Mehrabian, James A. Russell
A Verbal Measure of Information Rate for Studies in Environmental Psychology
Environment and Behavior, 6/1974, pages 233-252
11. Gianfranco Doretto, Alessandro Chiuso, Ying Nian Wu, Stefano Soatto
Dynamic Textures
International Journal of Computer Vision, 51/2003, pages 91-109
12. Katharina A. Schwarz, Matthias J. Wieser, Antje B. M. Gerdes, Andreas Mühlberger, Paul Pauli
Why are you looking like that? How the context influences evaluation and processing of human faces
Social Cognitive and Affective Neuroscience, 8/2012, pages 438-445. DOI: 10.1093/scan/nss013
13. Jan van den Stock, Mathieu Vandenbulcke, Charlotte B. A. Sinke, Rainer Goebel, Beatrice de Gelder
How affective information from faces and scenes interacts in the brain
Social Cognitive and Affective Neuroscience, 9/2014, pages 1481-1488. DOI: 10.1093/scan/nst138
14. Matthias J. Wieser, Antje B. M. Gerdes, Inga Büngel, …, Paul Pauli
Not so harmless anymore: How context impacts the perception and electrocortical processing of neutral faces
15. C. Hindi Attar, M. M. Müller, S. K. Andersen, …, M. Rose
Emotional Processing in a Salient Motion Context: Integration of Motion and Emotion in Both V5/hMT+ and the Amygdala
Journal of Neuroscience, 30/2010, pages 5204-5210. DOI: 10.1523/jneurosci.5029-09.2010
16. R. J. Harris, A. W. Young, T. J. Andrews
Morphing between expressions dissociates continuous from categorical representations of facial expression in the human brain
Proceedings of the National Academy of Sciences, 109/2012, pages 21164-21169. DOI: 10.1073/pnas.1212207110
17. Paul Ekman
An argument for basic emotions
Cognition and Emotion, 6/1992, pages 169-200. DOI: 10.1080/02699939208411068
18. Matt Lockyer, Lyn Bartram
Affective motion textures
Computers & Graphics, 36/2012, pages 776-790. DOI: 10.1016/j.cag.2012.04.009
19. Jeffrey Huang, Muriel Waldvogel
Interactive wallpaper
ACM SIGGRAPH 2005 Electronic Art and Animation Catalog, SIGGRAPH '05/2005, pages 172-176. DOI: 10.1145/1086057.1086142
20. Eliane Schreuder, Jan van Erp, Alexander Toet, Victor L. Kallen
Emotional Responses to Multisensory Environmental Stimuli: A Conceptual Framework and Literature Review
SAGE Open, 6/2016, article 2158244016630591. DOI: 10.1177/2158244016630591
21. World Medical Association
World Medical Association Declaration of Helsinki: Ethical Principles for Medical Research Involving Human Subjects
JAMA, 310/2013, pages 2191-2194. DOI: 10.1001/jama.2013.281053
22. Alexander Toet, Susanne Tak
Look Out, There is a Triangle behind You! The Effect of Primitive Geometric Shapes on Perceived Facial Dominance
i-Perception, 4/2013, pages 53-56. DOI: 10.1068/i0568sas
23. Rebecca E. Cooney, Lauren Y. Atlas, Jutta Joormann, Fanny Eugène, Ian H. Gotlib
Amygdala activation in the processing of neutral faces in social anxiety disorder: Is neutral really neutral?
Psychiatry Research: Neuroimaging, 148/2006, pages 55-59. DOI: 10.1016/j.pscychresns.2006.05.003
24. K. Lira Yoon, Richard E. Zinbarg
Interpreting neutral faces as threatening is a default mode for socially anxious individuals.
Journal of Abnormal Psychology, 117/2008, pages 680-685. DOI: 10.1037/0021-843x.117.3.680