An integrated framework of skin lesion detection and recognition through saliency method and optimal deep neural network features selection

Khan M. A.; Rashid M. (co-last)
2020-01-01

Abstract

Malignant melanoma is not a common type of skin cancer, but it is the most serious because of its rapid growth, affecting a large number of people worldwide. Recent studies indicate that the associated risks can be substantially reduced, making the disease almost treatable, if it is detected at an early stage. Such timely detection and classification demand an automated system, although the procedure is quite complex. In this article, a novel strategy is adopted that not only diagnoses skin cancer but also assigns a proper class label. The proposed technique is principally built on saliency estimation and the selection of the most discriminant deep features. Lesion contrast is first enhanced using a proposed Gaussian method, followed by a color space transformation from RGB to HSV. The new color space facilitates saliency map construction, using inner and outer disjoint windows to make the foreground and background maximally differentiable. From the segmented images, deep features are extracted with the Inception CNN model from two output layers. These feature sets are then fused using a proposed decision-controlled parallel fusion method, prior to feature selection with a proposed window distance-controlled entropy feature selection method. The most discriminant features are finally passed to the classification step. To demonstrate the efficiency of the proposed methods, three freely available datasets are used: PH2, ISBI 2016, and ISBI 2017, on which accuracies of 97.74%, 96.1%, and 97% are achieved, respectively. Simulation results clearly show the improved performance of the proposed method on all three datasets compared to existing methods.
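The RGB-to-HSV transformation step mentioned above can be illustrated with Python's standard `colorsys` module. This is a minimal sketch, not the paper's implementation; the function name `rgb_image_to_hsv` and the nested-list image representation are assumptions for illustration only.

```python
import colorsys

def rgb_image_to_hsv(image):
    """Convert an RGB image (nested lists of (r, g, b) tuples in 0..255)
    to HSV, with every channel scaled to the range 0..1."""
    hsv = []
    for row in image:
        hsv_row = []
        for r, g, b in row:
            hsv_row.append(colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0))
        hsv.append(hsv_row)
    return hsv

# A 1x2 toy image: one pure-red pixel, one pure-green pixel.
tiny = rgb_image_to_hsv([[(255, 0, 0), (0, 255, 0)]])
# red maps to hue 0.0; green maps to hue 1/3
```

In HSV, hue separates chromatic content from brightness, which is what makes lesion foreground and skin background easier to tell apart than in raw RGB.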
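The inner/outer disjoint-window idea behind the saliency map can be sketched as follows: for each pixel, compare the mean intensity inside a small inner window against the mean of the surrounding ring (inside the outer window but outside the inner one); a large difference marks the pixel as salient. The exact saliency formulation of the paper is not reproduced here — the window radii `r_in`/`r_out` and the absolute-mean-difference score are illustrative assumptions.

```python
def window_saliency(channel, r_in=1, r_out=2):
    """Saliency sketch on one channel (nested lists of floats):
    score each pixel by the absolute difference between the mean of the
    inner (2*r_in+1)^2 window and the mean of the disjoint outer ring."""
    h, w = len(channel), len(channel[0])
    sal = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            inner, outer = [], []
            for dy in range(-r_out, r_out + 1):
                for dx in range(-r_out, r_out + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        # Chebyshev distance decides inner window vs outer ring.
                        if max(abs(dy), abs(dx)) <= r_in:
                            inner.append(channel[yy][xx])
                        else:
                            outer.append(channel[yy][xx])
            if outer:
                sal[y][x] = abs(sum(inner) / len(inner) - sum(outer) / len(outer))
    return sal
```

Because the two windows are disjoint, a bright lesion pixel surrounded by darker skin yields a high score, while uniform regions (all-foreground or all-background) score near zero.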
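The entropy-based feature selection step can be illustrated generically: estimate the Shannon entropy of each feature column from a histogram and keep the k highest-entropy (most informative) features. The paper's specific window distance-controlled criterion is not described in this record, so the histogram binning and top-k ranking below are assumptions for illustration.

```python
import math
from collections import Counter

def feature_entropy(values, bins=8):
    """Shannon entropy (bits) of one feature, estimated from a histogram."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant features
    counts = Counter(min(int((v - lo) / span * bins), bins - 1) for v in values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def select_top_k(feature_matrix, k):
    """Rank features (columns of a row-major matrix) by entropy and
    return the indices of the k most informative ones, sorted."""
    n_feats = len(feature_matrix[0])
    scores = [feature_entropy([row[j] for row in feature_matrix])
              for j in range(n_feats)]
    ranked = sorted(range(n_feats), key=lambda j: scores[j], reverse=True)
    return sorted(ranked[:k])
```

A constant feature carries zero entropy and is dropped first, which is the intuition behind entropy-controlled selection: low-entropy features contribute little discriminative information to the classifier.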
Neural Computing and Applications, vol. 32, no. 20, pp. 15929–15948, 2020
Keywords: CNN features; Fusion; Melanoma; Neural network; Optimal features; Saliency segmentation
Authors: Khan M.A.; Akram T.; Sharif M.; Javed K.; Rashid M.; Bukhari S.A.C.
Files in this record:
An-integrated-framework-of-skin-lesion-detection-and-recognition-through-saliency-method-and-optimal-deep-neural-network-features-selectionNeural-Computing-and-Applications.pdf (open access; publisher's PDF; Adobe PDF; 3.62 MB)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2318/2078330
Citations
  • Scopus: 60
  • Web of Science: 65