Texture image classification with discriminative neural networks

Yang Song, Qing Li, Dagan Feng, Ju Jia Zou, Weidong Cai

Research output: Contribution to journal › Article › peer-review

Abstract

Texture provides an important cue for many computer vision applications, and texture image classification has been an active research area in recent years. Recently, deep learning techniques using convolutional neural networks (CNNs) have emerged as the state of the art: CNN-based features provide a significant performance improvement over previous handcrafted features. In this study, we demonstrate that we can further improve the discriminative power of CNN-based features and achieve more accurate classification of texture images. In particular, we have designed a discriminative neural network-based feature transformation (NFT) method, with which the CNN-based features are transformed into lower-dimensional descriptors based on an ensemble of neural networks optimized for the classification objective. For evaluation, we used three standard benchmark datasets (KTH-TIPS2, FMD, and DTD) for texture image classification. Our experimental results show enhanced classification performance over the state of the art.
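
The abstract describes the NFT pipeline only at a high level. The sketch below is an illustrative reconstruction under stated assumptions, not the authors' released code: it assumes pre-extracted CNN features (random tensors stand in for, e.g., fully-connected-layer activations), a single hidden layer per ensemble member, and illustrative dimensions (4096-d input, 128-d descriptor, 11 classes as in KTH-TIPS2). Each member maps a CNN feature vector to a lower-dimensional descriptor and is trained with a cross-entropy classification objective; the members' descriptors are concatenated to form the final representation.

```python
# Minimal sketch of a discriminative neural-network feature transformation (NFT).
# Assumptions (not taken from the paper's text): PyTorch, one hidden layer per
# member, 4096-d input features, 128-d descriptors, an ensemble of 3 members.
import torch
import torch.nn as nn

class FeatureTransformNet(nn.Module):
    """One ensemble member: CNN feature -> low-dimensional descriptor -> class scores."""
    def __init__(self, in_dim=4096, out_dim=128, num_classes=11):
        super().__init__()
        self.transform = nn.Sequential(nn.Linear(in_dim, out_dim), nn.ReLU())
        self.classifier = nn.Linear(out_dim, num_classes)  # drives the discriminative training

    def forward(self, x):
        descriptor = self.transform(x)
        return self.classifier(descriptor), descriptor

def train_member(net, features, labels, epochs=50, lr=1e-3):
    """Optimize the transformation for the classification objective (cross-entropy)."""
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        logits, _ = net(features)
        loss_fn(logits, labels).backward()
        opt.step()
    return net

# Toy usage: random tensors stand in for real CNN features and texture labels.
features = torch.randn(256, 4096)        # 256 samples of 4096-d CNN features
labels = torch.randint(0, 11, (256,))    # 11 texture classes (as in KTH-TIPS2)

ensemble = [train_member(FeatureTransformNet(), features, labels) for _ in range(3)]
with torch.no_grad():
    descriptors = torch.cat([net(features)[1] for net in ensemble], dim=1)
print(descriptors.shape)  # torch.Size([256, 384]): concatenated low-dimensional descriptors
```

The concatenated descriptors would then be fed to a final classifier; dimensions, ensemble size, and training schedule here are placeholders chosen for readability.
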
Original language: English
Pages (from-to): 367-377
Number of pages: 11
Journal: Computational Visual Media
Volume: 2
Issue number: 4
DOIs
Publication status: Published - 2016

Open Access - Access Right Statement

© The Author(s) 2016. Distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Keywords

  • computer vision
  • neural networks (computer science)
  • texture
