Perception‐based image classification

Christopher Henry (Electrical and Computer Engineering, University of Manitoba, Winnipeg, Canada)
James F. Peters (Electrical and Computer Engineering, University of Manitoba, Winnipeg, Canada)

International Journal of Intelligent Computing and Cybernetics

ISSN: 1756-378X

Publication date: 24 August 2010



The purpose of this paper is to present near set theory using the perceptual indiscernibility and tolerance relations, to demonstrate the practical application of near set theory to the image correspondence problem, and to compare this method with existing image similarity measures.


Image‐correspondence methodologies are present in many systems relied on daily. In these systems, the discovery of sets of similar objects (aka, tolerance classes) stems from human perception of the objects being classified. This view of perception of image‐correspondence springs directly from Poincaré's work on visual spaces during the 1890s and Zeeman's work on tolerance spaces and visual acuity during the 1960s. Thus, in solving the image‐correspondence problem, it is important to have systems that accurately model human perception. Near set theory provides a framework for measuring the similarity of digital images (and perceptual objects, in general) based on features that describe them in much the same way that humans perceive objects.
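The perceptual tolerance relation described above can be sketched in a few lines: two objects are tolerant if their feature-vector descriptions lie within a threshold ε of one another. The function names, the 1-D toy descriptions, and the Euclidean-norm choice below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def tolerant(phi_x, phi_y, eps=0.1):
    # Perceptual tolerance relation: descriptions (feature vectors)
    # are indistinguishable when within eps of each other.
    return np.linalg.norm(np.asarray(phi_x) - np.asarray(phi_y)) <= eps

def tolerance_neighborhood(idx, descriptions, eps=0.1):
    # All objects whose descriptions are tolerant with object idx.
    return [j for j, d in enumerate(descriptions)
            if tolerant(descriptions[idx], d, eps)]

# Toy example: 1-D greyscale-like descriptions of five objects.
descs = [[0.10], [0.12], [0.50], [0.52], [0.90]]
print(tolerance_neighborhood(0, descs, eps=0.05))  # → [0, 1]
```

Note that, unlike an equivalence relation, tolerance is not transitive, so overlapping neighborhoods rather than a partition emerge; this is what lets the model mirror gradual perceptual similarity.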


The contribution of this paper is a perception‐based classification of images using near sets.


The method presented in this paper represents a new approach to solving problems in which the goal is to match human perceptual groupings. While the results presented in the paper are based on measuring the resemblance between images, the approach can be applied to any application that can be formulated in terms of sets such that the objects in the sets can be described by feature vectors.
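As a minimal sketch of how such a formulation might look, the resemblance of two sets of feature vectors can be scored by the fraction of cross-set pairs that satisfy the tolerance relation. This is an illustrative toy measure under assumed names and data, not the nearness measure defined in the paper.

```python
import numpy as np

def nearness(A, B, eps=0.1):
    # Fraction of cross-set description pairs within eps of each
    # other; 1.0 means every pair is perceptually indistinguishable.
    # Illustrative only -- not the paper's exact measure.
    pairs = [(a, b) for a in A for b in B]
    hits = sum(np.linalg.norm(np.asarray(a) - np.asarray(b)) <= eps
               for a, b in pairs)
    return hits / len(pairs)

X = [[0.10, 0.20], [0.15, 0.22]]  # feature vectors from image X
Y = [[0.12, 0.21], [0.80, 0.90]]  # feature vectors from image Y
print(round(nearness(X, Y, eps=0.1), 2))  # → 0.5
```

Because the sets only need to contain describable objects, the same scheme applies to any domain (audio segments, sensor readings) once a feature extractor is fixed.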



Henry, C. and Peters, J. (2010), "Perception‐based image classification", International Journal of Intelligent Computing and Cybernetics, Vol. 3 No. 3, pp. 410-430.





Copyright © 2010, Emerald Group Publishing Limited
