hig.se Publications
1 - 3 of 3
  • 1.
    Karlsson, Kristina P.
    Department of Psychology, Stockholm University, Stockholm, Sweden.
    Sikström, Sverker
    Department of Psychology, Lund University, Lund, Sweden.
    Jönsson, Fredrik U.
    Department of Psychology, Stockholm University, Stockholm, Sweden.
    Gustafsson Sendén, Marie
    Department of Psychology, Stockholm University, Stockholm, Sweden.
    Willander, Johan
    University of Gävle, Faculty of Health and Occupational Studies, Department of Occupational Health Science and Psychology, Psychology.
Gender differences in autobiographical memory: females latently express communality more than do males. 2019. In: Journal of Cognitive Psychology, ISSN 2044-5911, E-ISSN 2044-592X, Vol. 31, no. 7. Article in journal (Refereed).
    Abstract [en]

    Gender differences have been found in several aspects of autobiographical memory (i.e. personally experienced events). For example, previous studies have shown that females' autobiographical memories contain more communal and emotional expressions than do males'. However, an important question concerns whether these differences can be observed both in the manifest content (i.e. what is actually said) and in the latent content (i.e. the underlying meaning of what is said). In the present exploratory study, we extended the current knowledge concerning gender differences in autobiographical memory by investigating the manifestly expressed words, as well as the latently expressed words, in autobiographical memory descriptions. We observed an overall gender difference in the latent content of the autobiographical memories. Furthermore, females latently described their memories in more communal terms than males did. No other gender differences were found. Our results indicate that females' autobiographical memories are more communally oriented than males'.

  • 2.
    MacCutcheon, Douglas
    University of Gävle, Faculty of Engineering and Sustainable Development, Department of Building Engineering, Energy Systems and Sustainability Science, Environmental Science.
    Hurtig, Anders
    University of Gävle, Faculty of Engineering and Sustainable Development, Department of Building Engineering, Energy Systems and Sustainability Science, Environmental Science.
    Pausch, Florian
    Aachen University, Aachen, Germany.
    Hygge, Staffan
    University of Gävle, Faculty of Engineering and Sustainable Development, Department of Building Engineering, Energy Systems and Sustainability Science, Environmental Science.
    Fels, Janina
    Aachen University, Aachen, Germany.
    Ljung, Robert
    University of Gävle, Faculty of Engineering and Sustainable Development, Department of Building Engineering, Energy Systems and Sustainability Science, Environmental Science.
Second language vocabulary level is related to benefits for second language listening comprehension under lower reverberation time conditions. 2019. In: Journal of Cognitive Psychology, ISSN 2044-5911, E-ISSN 2044-592X, Vol. 31, no. 2, p. 175-185. Article in journal (Refereed).
    Abstract [en]

    The acoustic qualities of a room can have a deleterious effect on the quality of speech signals. A lower reverberation time (RT) has been shown to impact second language (L2) speech comprehension positively, owing to release from spectral and temporal masking effects as well as top-down processing factors. This auralization experiment investigated the benefits of better L2 vocabulary and executive function (updating) skills during L2 listening comprehension tests under shorter versus longer RT conditions (0.3 and 0.9 s). Fifty-seven bilingual university students undertook L2 vocabulary, number updating and L2 listening comprehension tests. After splitting participants into high/low vocabulary and updating groups, a mixed ANOVA was conducted. The high number updating group showed no significant differences or interactions in L2 listening comprehension compared with the low number updating group across RT conditions. The high vocabulary group had 22% better L2 listening comprehension than the low vocabulary group in the long RT condition, and 9% better in the short RT condition. A significant benefit in L2 listening comprehension due to release from reverberation was evident only in the high vocabulary group. Results indicate that the benefit of good room acoustics for listening comprehension is greatest for those with better language (vocabulary) ability.

  • 3.
    Marois, Alexandre
    École de psychologie, Université Laval, Québec, QC, Canada.
    Vachon, François
    University of Gävle, Faculty of Engineering and Sustainable Development, Department of Building, Energy and Environmental Engineering, Environmental psychology. École de psychologie, Université Laval, Québec, QC, Canada.
Can pupillometry index auditory attentional capture in contexts of active visual processing? 2018. In: Journal of Cognitive Psychology, ISSN 2044-5911, E-ISSN 2044-592X, Vol. 30, no. 4, p. 484-502. Article in journal (Refereed).
    Abstract [en]

    The rare presentation of a sound that deviates from the auditory background tends to capture attention, which is known to impede cognitive functioning. Such disruption is usually measured using performance on a concurrent visual task. Growing evidence recently showed that the pupillary dilation response (PDR) could index the attentional response triggered by a deviant sound. Given that pupil diameter is sensitive to several vision-related factors, it is unclear whether the PDR could serve to study attentional capture in such contexts. Hence, the present study aimed at verifying whether the PDR can be used as a proxy for auditory attentional capture while a visual serial recall task (Experiment 1) or a reading comprehension task (Experiment 2), which respectively produce changes in luminance and gaze position, is being performed. Results showed that presenting a deviant sound within steady-state standard sounds elicited larger PDRs than a standard sound. Moreover, the magnitude of these PDRs was positively related to the amount of performance disruption produced by deviant sounds in Experiment 1. Performance remained unaffected by the deviants in Experiment 2, thereby implying that the PDR may be a more sensitive attention-capture index than behavioural measures. These results suggest that the PDR can be used to assess attentional capture by a deviant sound in contexts where pupil diameter can be modulated by the visual environment.
