Classification of clustered microcalcifications using a Shape Cognitron neural network

  • Authors:
  • San-Kan Lee; Pau-choo Chung; Chein-I Chang; Chien-Shun Lo; Tain Lee; Giu-Cheng Hsu; Chin-Wen Yang

  • Affiliations:
  • Department of Radiology, Taichung Veterans General Hospital, VACRS, Taichung 40705, Taiwan, ROC and Department of Diagnostic Radiology, National Defense Medical Center, Taipei 100, Taiwan, ROC
  • Department of Electrical Engineering, National Cheng Kung University, Tainan 70101, Taiwan, ROC
  • Remote Sensing Signal and Image Processing Laboratory, Department of Computer Science and Electrical Engineering, University of Maryland Baltimore County, Baltimore, MD
  • Department of Electrical Engineering, National Cheng Kung University, Tainan 70101, Taiwan, ROC
  • Department of Radiology, Taichung Veterans General Hospital, VACRS, Taichung 40705, Taiwan, ROC
  • Radiological Section, Taiwan Adventist Hospital, Taipei 40705, Taiwan, ROC
  • Computer Center, Taichung Veterans General Hospital, VACRS, Taichung 40705, Taiwan, ROC

  • Venue:
  • Neural Networks
  • Year:
  • 2003

Abstract

A new shape recognition-based neural network built with universal feature planes, called the Shape Cognitron (S-Cognitron), is introduced to classify clustered microcalcifications. The S-Cognitron architecture consists of two modules with an extra layer, called the 3D figure layer, lying in between. The first module contains a shape orientation layer, built with 20 cell planes of low-level universal shape features, which converts first-order shape orientations into numeric values, and a complex layer, which extracts second-order shape features. The 3D figure layer is a feature extraction-and-display layer that extracts the shape curvatures of an input pattern and displays them as a 3D figure. It is followed by a second module made up of a feature formation layer and a probabilistic neural network-based classification layer. The system is evaluated on the Nijmegen mammogram database, and experimental results show that sensitivity and specificity reach 86.1% and 74.1%, respectively.
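
The abstract describes a pipeline of orientation coding, second-order (curvature) feature extraction, feature formation, and probabilistic neural network (PNN) classification. The following is a minimal sketch of such a pipeline, assuming a closed contour as input; all function names, the 20-level quantization, the curvature-histogram feature, and the kernel width `sigma` are illustrative assumptions, not the paper's actual cell-plane implementation.

```python
import numpy as np

def orientation_codes(contour, n_codes=20):
    """First module (sketch): quantize first-order shape orientations of a
    closed contour (N x 2 array of points) into n_codes numeric values."""
    diffs = np.diff(contour, axis=0, append=contour[:1])     # wrap around
    angles = np.arctan2(diffs[:, 1], diffs[:, 0])            # in [-pi, pi)
    return np.floor((angles + np.pi) / (2 * np.pi) * n_codes).astype(int) % n_codes

def curvature_codes(codes, n_codes=20):
    """Complex / 3D figure layer (sketch): second-order features as signed
    changes between successive orientation codes (a discrete curvature)."""
    d = np.diff(codes, append=codes[:1])
    return (d + n_codes // 2) % n_codes - n_codes // 2       # in [-10, 9]

def feature_vector(contour, n_codes=20):
    """Feature formation layer (sketch): normalized curvature histogram."""
    curv = curvature_codes(orientation_codes(contour, n_codes), n_codes)
    hist, _ = np.histogram(curv, bins=np.arange(-n_codes // 2, n_codes // 2 + 1))
    return hist / max(hist.sum(), 1)

def pnn_classify(x, train_X, train_y, sigma=0.1):
    """PNN classification layer: Parzen-window class-density estimates
    (sum of Gaussian kernels per class); returns the most probable class."""
    scores = {}
    for c in np.unique(train_y):
        d = train_X[train_y == c] - x
        scores[c] = np.mean(np.exp(-np.sum(d * d, axis=1) / (2 * sigma ** 2)))
    return max(scores, key=scores.get)

if __name__ == "__main__":
    # Toy demo: a smooth circular contour yields curvature mass near zero.
    t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
    circle = np.stack([np.cos(t), np.sin(t)], axis=1)
    print(feature_vector(circle))
```

A PNN, as used here for the classification layer, needs no iterative training: each class score is a kernel-density estimate over its stored training samples, which matches the one-pass, density-based classifier named in the abstract.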