How do people find pairs?
- PMID: 36951742
- DOI: 10.1037/xge0001390
Abstract
Humans continuously scan their visual environment for relevant information. Such visual search behavior has typically been studied with tasks in which the search goal is constant and well-defined, requiring relatively little interplay between memory and orienting. Here we studied a situation in which the target is not known in advance and memory instead needs to be dynamically updated during the search itself. Observers compared two simultaneously presented arrays of objects for any matching pair of items, a task that requires continuous comparisons between what is seen now and what was seen a few moments ago. To manipulate the balance between memorizing and scanning, we ran two versions of the task. In an eye-tracking version, the objects were continuously available and could be scanned with relative ease. The results suggested that observers preferred scanning over memorizing. In a mouse-tracking version, perceptual availability was limited and scanning was slowed. Now observers substantially increased their memory use. Thus, the results revealed a flexible and dynamic interplay between memory and perception. The findings aid in further bridging the research fields of attention and memory.
Similar articles
- Hybrid foraging search in younger and older age. Psychol Aging. 2019 Sep;34(6):805-820. doi: 10.1037/pag0000387. Epub 2019 Aug 15. PMID: 31414857. Free PMC article.
- The visual arrays task: Visual storage capacity or attention control? J Exp Psychol Gen. 2021 Dec;150(12):2525-2551. doi: 10.1037/xge0001048. Epub 2021 Sep 30. PMID: 34591545. Free PMC article.
- Visual search within working memory. J Exp Psychol Gen. 2019 Oct;148(10):1688-1700. doi: 10.1037/xge0000555. Epub 2019 Jan 21. PMID: 30667264.
- Tuning the ensemble: Incidental skewing of the perceptual average through memory-driven selection. J Exp Psychol Hum Percept Perform. 2021 May;47(5):648-661. doi: 10.1037/xhp0000907. Epub 2021 Mar 15. PMID: 33719468.
- Studying visual attention using the multiple object tracking paradigm: A tutorial review. Atten Percept Psychophys. 2017 Jul;79(5):1255-1274. doi: 10.3758/s13414-017-1338-1. PMID: 28584953. Review.
Cited by
- Peripheral vision contributes to implicit attentional learning: Findings from the "mouse-eye" paradigm. Atten Percept Psychophys. 2024 Jun 5. doi: 10.3758/s13414-024-02907-5. Online ahead of print. PMID: 38839714.
- Don't hide the instruction manual: A dynamic trade-off between using internal and external templates during visual search. J Vis. 2023 Jul 3;23(7):14. doi: 10.1167/jov.23.7.14. PMID: 37486300. Free PMC article.