Analysis of Deep Networks with Residual Blocks and Different Activation Functions: Classification of Skin Diseases


GÖÇERİ E.

9th International Conference on Image Processing Theory, Tools and Applications (IPTA), İstanbul, Türkiye, 6 - 09 November 2019

  • Publication Type: Conference Paper / Full-Text Conference Paper
  • DOI Number: 10.1109/ipta.2019.8936083
  • Publication City: İstanbul
  • Publication Country: Türkiye
  • Keywords: CNN, deep networks, image classification, ReLU, residual block, ResNET, SELU, skin diseases
  • Affiliated with Akdeniz University: Yes

Abstract

Deep convolutional neural networks have been implemented for image classification tasks and have achieved promising results in recent years. In particular, ResNETs have been used because they can eliminate the vanishing gradient problem in very deep networks. However, ResNET architectures with different activation functions, batch sizes, and numbers of training and testing images can produce different results. Therefore, the effect of residual connections and activation functions on image classification is still unclear. Also, in the literature, ResNET-based models have been trained and tested on data sets with different characteristics; to make meaningful comparisons between different ResNET models, however, the same data sets should be used. Therefore, in this work, four network models have been implemented to analyze the effect of two activation functions (ReLU and SELU) and of residual learning on image classification using the same data sets. To evaluate the performance of these models, a real-world problem, automated skin disease classification from colored digital images, has been addressed. Experimental results and comparative analyses indicate that the network with SELU and without residual blocks yields the highest validation accuracy (97.01%) for image classification.
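
To make the compared configurations concrete, the following is a minimal PyTorch sketch, not the architecture used in the paper: layer sizes, the use of batch normalization, and the block layout are illustrative assumptions. It shows a convolutional block whose activation can be switched between ReLU and SELU and whose identity shortcut can be enabled or disabled, mirroring the four model variants (ReLU/SELU, with/without residual learning) evaluated in this work.

# Illustrative sketch only; hyperparameters and layer layout are assumptions,
# not the models from the paper.
import torch
import torch.nn as nn

class ConvBlock(nn.Module):
    def __init__(self, channels, activation="relu", residual=True):
        super().__init__()
        self.residual = residual
        # Swappable activation: ReLU or SELU, as in the four compared variants.
        make_act = lambda: nn.ReLU(inplace=True) if activation == "relu" else nn.SELU(inplace=True)
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            make_act(),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
        )
        self.out_act = make_act()

    def forward(self, x):
        out = self.body(x)
        if self.residual:
            out = out + x  # identity shortcut (residual learning)
        return self.out_act(out)

# Example: one block configured as "SELU, no residual connection",
# the variant reported to reach the highest validation accuracy.
block = ConvBlock(channels=64, activation="selu", residual=False)
y = block(torch.randn(1, 64, 56, 56))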