Auditory Stimulus Detection Partially Depends on Visuospatial Attentional Resources

Please use this identifier to cite or link to this resource:
https://osnadocs.ub.uni-osnabrueck.de/handle/urn:nbn:de:gbv:700-2018050217095
Full metadata record
DC Field | Value | Language
dc.creator | Wahn, Basil | -
dc.creator | Murali, Supriya | -
dc.creator | Sinnett, Scott | -
dc.creator | König, Peter | -
dc.date.accessioned | 2018-05-02T09:43:08Z | -
dc.date.available | 2018-05-02T09:43:08Z | -
dc.date.issued | 2018-05-02T09:43:08Z | -
dc.identifier.citation | i-Perception 2017. Sage | -
dc.identifier.uri | https://osnadocs.ub.uni-osnabrueck.de/handle/urn:nbn:de:gbv:700-2018050217095 | -
dc.description.abstract | Humans’ ability to detect relevant sensory information while being engaged in a demanding task is crucial in daily life. Yet, limited attentional resources restrict information processing. To date, it is still debated whether there are distinct pools of attentional resources for each sensory modality and to what extent the process of multisensory integration is dependent on attentional resources. We addressed these two questions using a dual task paradigm. Specifically, participants performed a multiple object tracking task and a detection task either separately or simultaneously. In the detection task, participants were required to detect visual, auditory, or audiovisual stimuli at varying stimulus intensities that were adjusted using a staircase procedure. We found that the tasks significantly interfered. However, the interference was about 50% lower when the tasks were performed in separate sensory modalities rather than in the same sensory modality, suggesting that attentional resources are partly shared. Moreover, we found that perceptual sensitivities were significantly improved for audiovisual stimuli relative to unisensory stimuli, regardless of whether attentional resources were diverted to the multiple object tracking task or not. Overall, the present study supports the view that attentional resource allocation in multisensory processing is task-dependent and suggests that multisensory benefits are not dependent on attentional resources. | eng
dc.relation | http://journals.sagepub.com/doi/abs/10.1177/2041669516688026 | -
dc.rights | Attribution 3.0 Unported | -
dc.rights.uri | http://creativecommons.org/licenses/by/3.0/ | -
dc.subject | Attentional resources | eng
dc.subject | multisensory processing | eng
dc.subject | vision | eng
dc.subject | audition | eng
dc.subject | multiple object tracking | eng
dc.subject | multisensory integration | eng
dc.subject | load theory | eng
dc.subject.ddc | 610 - Medicine, health | -
dc.subject.ddc | 150 - Psychology | -
dc.title | Auditory Stimulus Detection Partially Depends on Visuospatial Attentional Resources | eng
dc.type | Article in a scholarly journal [article] | ger
dc.identifier.doi | 10.1177/2041669516688026 | -
vCard.ORG | FB8 | -
Appears in collections: FB08 - Hochschulschriften
Open-Access-Publikationsfonds

Files in this resource:
File | Description | Size | Format
i-perception_2041669516688026_2017_Wahn.pdf | | 502.24 kB | Adobe PDF


This resource was published under the following copyright terms: Creative Commons license.