|dc.description.abstract||In this review, the current status of search studies in both the visual and the haptic modality is discussed, with the aim of assessing the similarity of the processes involved. While recent models of visual search propose differing mechanisms to explain experimental results, it is generally agreed that the search process consists of a pre-attentive parallel stage and an attentive serial stage. During the pre-attentive stage, basic features are examined, and this information is used to guide attention towards salient items during the attentive stage, improving the efficiency of the search process. Using such models, average response times can be predicted accurately across a wide range of experimental settings. However, modelled response time distributions often do not correspond to experimental data, indicating that many current models of visual search are incomplete. Other points of discussion also remain, such as the cause of asymmetric behaviour in symmetric task designs and the exact mechanism of top-down control.
The haptic modality of search has been studied less extensively than the visual one. Haptic experiments exhibit behaviours similar to those of visual search, with search efficiency depending on target and distractor features in the same way as described for visual search. Discrepancies observed between visual and haptic search are often due to suboptimally matched stimuli between the two modalities. Because of differences in feature dimensions, spatial constraints and experimental methods, such comparisons are difficult, and the results of recent studies are not conclusive. There are, however, indications that search data are at least in part exchangeable between visual and haptic processes.||