Joshua A. Tabak and Vivian Zayas conducted a study, “The Roles of Featural and Configural Face Processing in Snap Judgments of Sexual Orientation,” published on 16 May 2012. They found that “people are able to judge men’s and women’s sexual orientation with above-chance accuracy” based on a photograph shown to the participant for between forty and fifty milliseconds. Each photograph was in greyscale, and the subject’s head hair had been removed from the image. Tabak and Zayas then asked how these judgments were formed.
The two hypothesized either that “configural face processing contributes to accurate snap judgments of sexual orientation” or that “the process of reading sexual orientation from faces may differ as a function of whether the stimulus person (face) is male or female.” The first hypothesis suggests that configural face processing, a “more individuating type of face processing,” plays a large role in accurately assessing an individual’s sexual orientation, because researchers have found that sexual orientation is phenotypically unclear or inexact. The second hypothesis proposes that judgments differ depending on whether the face being judged is male or female.
Two experiments were then conducted. The first used twenty-four students from the University of Washington: nineteen women and five men, all between eighteen and twenty-two years of age. Four hundred thirteen images were gathered from Facebook profiles of people who clearly identified as gay male, straight male, lesbian, or straight female. The people in the photographs were between eighteen and twenty-nine years old, and none had facial scars, facial hair, makeup, tattoos, or piercings. The trained research assistants who prepared the pictures by removing ears and head hair were blind to the purpose of the experiment. “Each trial consisted of: (a) a fixation cross for 1000 ms, (b) a target face stimulus for 50 ms, and (c) a backward mask for 100 ms, after which participants categorized the target face as either ‘gay’ or ‘straight’ ‘as quickly and accurately as possible’ by depressing ‘A’ or ‘L.’ The intertrial interval was 1000 ms.” The results showed that “participants read sexual orientation significantly better than chance from upright faces of” men and women; upside-down faces, however, were not read as accurately. Women’s faces were also judged more accurately than men’s. Tabak and Zayas found this intriguing: because gay men are more prominent than gay women in the media, they had expected men to be identified more accurately as gay or straight than women.
Experiment two used ninety-two women and thirty-seven men between the ages of eighteen and twenty-five, with the same set of faces from experiment one. Because the male faces in experiment one had been darker in tone than the female faces, research assistants lightened the male faces so that they matched the female ones. Participants judged either upright or upside-down faces. The researchers again found that “participants read sexual orientation significantly better than chance from upright faces of women… and upright faces of men.” Experiment two answered the configural-processing question affirmatively: “configural face processing” does “boost accuracy of sexual orientation judgments above the accuracy observed when judgments are primarily limited to featural face processing.” In addition, presenting faces upside down disrupted configural face processing but had “little or no detectable effect on featural face processing,” which showed that “accurate judgments of women’s and men’s sexual orientation were possible even when face processing was largely restricted to featural face information.”
Tabak, Joshua A., and Vivian Zayas. “The Roles of Featural and Configural Face Processing in Snap Judgments of Sexual Orientation.” PLOS ONE, 16 May 2012. Web. 20 Mar. 2015. <http://journals.plos.org/plosone/article?id=10.1371%2Fjournal.pone.0036671>.