Adaptive FOA Region Extraction for Saliency-Based Visual Attention

Authors

  • Hyungjik Lee
  • Changseok Bae
  • Janghan Lee
  • Sungwon Sohn
Abstract

This paper describes adaptive extraction of the focus-of-attention (FOA) region for saliency-based visual attention. The saliency map model identifies the most salient location in the visual scene. The human brain exhibits an inhibition-of-return property, by which the currently attended point is prevented from being attended again. Implementing the focus of attention and the inhibition-of-return function therefore requires an appropriate mask for the salient region, and a shape-based mask may be more suitable than other masks. In contrast to the existing fixed-size FOA, we propose an adaptive, shape-based FOA region derived from the most salient region of the saliency map. We locate the most salient point by scanning every value in the saliency map, expand the neighborhood of that point until the average value of the neighborhood falls below 75% of the value at the most salient point, and then find the contour of the neighborhood. The resulting adaptive FOA closely follows the shape of the attended object, which benefits object recognition and other computer vision tasks.
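The region-growing step described in the abstract (find the saliency peak, expand its neighborhood until the mean saliency falls below 75% of the peak, then keep the resulting region) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the square-window expansion and the threshold-based mask are assumptions, since the abstract does not specify the neighborhood shape or the contour-extraction method.

```python
import numpy as np

def adaptive_foa(saliency, ratio=0.75):
    """Grow an FOA region around the saliency maximum.

    Starting from the most salient pixel, expand a square
    neighborhood until the mean saliency inside it drops below
    `ratio` times the peak value, then return a boolean mask of
    the above-threshold pixels inside that neighborhood (a crude
    stand-in for the paper's shape-based contour).
    """
    peak = saliency.max()
    py, px = np.unravel_index(np.argmax(saliency), saliency.shape)
    h, w = saliency.shape
    r = 1
    while True:
        # Current square neighborhood, clipped to the map borders.
        y0, y1 = max(0, py - r), min(h, py + r + 1)
        x0, x1 = max(0, px - r), min(w, px + r + 1)
        window = saliency[y0:y1, x0:x1]
        if window.mean() < ratio * peak:
            break  # stopping criterion from the abstract
        if y0 == 0 and x0 == 0 and y1 == h and x1 == w:
            break  # whole map reached; stop expanding
        r += 1
    mask = np.zeros_like(saliency, dtype=bool)
    mask[y0:y1, x0:x1] = True
    # Shape-based refinement: keep only pixels still above the
    # threshold, so the region follows the attended object's shape.
    mask &= saliency >= ratio * peak
    return mask
```

A contour of the returned mask could then be extracted with a standard routine (e.g. OpenCV's `findContours`) to serve as the inhibition-of-return mask.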


Similar sources

Graph-based Visual Saliency Model using Background Color

Visual saliency is a cognitive psychology concept that makes some stimuli of a scene stand out relative to their neighbors and attract our attention. Computing visual saliency is a topic of recent interest. Here, we propose a graph-based method for saliency detection, which contains three stages: pre-processing, initial saliency detection and final saliency detection. The initial saliency map i...


Reduced-Reference Image Quality Assessment based on saliency region extraction

In this paper, a novel saliency theory based RR-IQA metric is introduced. As the human visual system is sensitive to the salient region, evaluating the image quality based on the salient region could increase the accuracy of the algorithm. In order to extract the salient regions, we use blob decomposition (BD) tool as a texture component descriptor. A new method for blob decomposition is propos...


Just Noticeable Difference Estimation Using Visual Saliency in Images

Due to some physiological and physical limitations in the brain and the eye, the human visual system (HVS) is unable to perceive some changes in the visual signal whose range is lower than a certain threshold so-called just-noticeable distortion (JND) threshold. Visual attention (VA) provides a mechanism for selection of particular aspects of a visual scene so as to reduce the computational loa...


Perceptual Object Extraction Based on Saliency and Clustering

Object-based visual attention has received increasing interest in recent years. The perceptual object is the basic attention unit of object-based visual attention. The definition and extraction of perceptual objects is one of the key technologies in computational models of object-based visual attention. A novel perceptual object definition and extraction method is proposed in this paper. Based on Gest...


Compressed-Sampling-Based Image Saliency Detection in the Wavelet Domain

When watching natural scenes, an overwhelming amount of information is delivered to the Human Visual System (HVS). The optic nerve is estimated to receive around 10^8 bits of information a second. This large amount of information can't be processed right away through our neural system. The visual attention mechanism enables the HVS to spend neural resources efficiently, only on the selected parts of the...



Publication date: 2012