Unsupervised image segmentation using comparative reasoning and random walks

Anuva Kulkarni, Filipe Condessa, Jelena Kovacevic

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

An image segmentation method that requires no training data can deliver results faster than methods relying on complex optimization. Motivated by this idea, we present an unsupervised image segmentation method that combines comparative reasoning with graph-based clustering. Comparative reasoning enables fast similarity search over the image, and the search results are passed to the Random Walks algorithm, which performs the clustering and computes class probabilities. Our method is validated on diverse image modalities, including biomedical, natural, and texture images. Performance is measured by cluster purity against available ground truth, and our results are compared to existing segmentation methods using Global Consistency Error scores.
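The "comparative reasoning" in the keywords refers to the Winner-Take-All (WTA) hash, which encodes a feature vector by rank order rather than magnitude, so hash agreement gives a fast similarity measure. The sketch below is illustrative only, not the authors' implementation; the helper names (`wta_hash`, `wta_similarity`) and the NumPy setup are assumptions made for the example.

```python
import numpy as np

def wta_hash(x, perms, k):
    # WTA hash: for each random permutation, keep the first k permuted
    # entries and record the index of the largest one (the "winner").
    # Hypothetical helper illustrating the general WTA scheme.
    return np.array([int(np.argmax(x[p[:k]])) for p in perms])

def wta_similarity(h1, h2):
    # Fraction of permutations whose winners agree; a proxy for
    # rank-order (comparative) similarity between the two vectors.
    return float(np.mean(h1 == h2))

rng = np.random.default_rng(0)
dim, n_perms, k = 16, 64, 4
perms = [rng.permutation(dim) for _ in range(n_perms)]

a = rng.normal(size=dim)
b = a + 0.05 * rng.normal(size=dim)   # slightly perturbed copy of a
c = rng.normal(size=dim)              # unrelated vector

ha, hb, hc = (wta_hash(v, perms, k) for v in (a, b, c))
# similar vectors agree on far more hash codes than unrelated ones
print(wta_similarity(ha, hb), wta_similarity(ha, hc))
```

In a segmentation pipeline of the kind the abstract describes, such hash agreements between image patches could populate the edge weights of the graph on which Random Walks then assigns class probabilities; the exact feature extraction and graph construction are paper-specific details not reproduced here.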

Original language: English (US)
Title of host publication: 2015 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 338-342
Number of pages: 5
ISBN (Electronic): 9781479975914
DOIs
State: Published - Feb 23 2016
Event: IEEE Global Conference on Signal and Information Processing, GlobalSIP 2015 - Orlando, United States
Duration: Dec 13 2015 - Dec 16 2015

Publication series

Name: 2015 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2015

Other

Other: IEEE Global Conference on Signal and Information Processing, GlobalSIP 2015
Country: United States
City: Orlando
Period: 12/13/15 - 12/16/15

Keywords

  • Random Walks
  • Unsupervised image segmentation
  • Winner Take All hash
  • comparative reasoning
  • hashing

ASJC Scopus subject areas

  • Information Systems
  • Signal Processing
