Hyperspectral Unmixing using Deep Learning
MennaTullah Mamdouh El-Kholy;
Abstract
In this chapter, the consistency and stability of the proposed unmixing
methods were experimentally tested using different datasets. The performance of
the proposed unmixing methods was investigated against different factors such as
the number of endmembers, the loss function, robustness to noise, and the speed-up rate.
To evaluate the behavior of the proposed linear unmixing method based on a
deep convolutional autoencoder (DCAE), different experiments were conducted on both
synthetic and real hyperspectral datasets. Experimental results indicate that the proposed
method outperforms both traditional and deep-learning-based unmixing methods. The
results demonstrated the strong performance of the proposed DCAE, which
outperforms benchmark unmixing methods even in noisy environments in terms
of both Root Mean Square Error (RMSE) and Mean Square Error (MSE). The
mean absolute error achieved by the proposed DCAE was
0.0097, 0.001, 0.0141, and 0.0145 for the Samson, Cuprite, Urban, and Jasper Ridge
datasets, respectively.
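The abundance-error metrics reported above can be sketched as follows. This is a minimal illustration, assuming the ground-truth and estimated abundance maps are NumPy arrays of the same shape (pixels × endmembers); the function names are illustrative, not taken from the proposed method's code.

```python
import numpy as np

# Illustrative sketch of the reported error metrics; array shapes and
# function names are assumptions, not the author's implementation.
def mse(a_true, a_est):
    # Mean Square Error over all abundance entries
    return float(np.mean((a_true - a_est) ** 2))

def rmse(a_true, a_est):
    # Root Mean Square Error
    return float(np.sqrt(mse(a_true, a_est)))

def mae(a_true, a_est):
    # Mean absolute error, the metric quoted per dataset above
    return float(np.mean(np.abs(a_true - a_est)))
```

Lower values indicate estimated abundance maps closer to the ground truth.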
To evaluate the performance of the proposed nonlinear unmixing method based
on deep learning, different experiments were conducted using synthetic and real
hyperspectral datasets to show the effectiveness of the proposed algorithm, and the
performance was evaluated using both the Spectral Angle Distance (SAD) and
Spectral Information Divergence (SID) metrics. The experiments
evaluated the performance in terms of accuracy assessment, weight initialization
techniques, learning rate, and robustness to noise. The results verified that the
proposed autoencoder outperforms traditional endmember extraction algorithms in
nonlinear cases. We also introduced the application of the hyperspectral image
unmixing algorithm in an Internet of Things (IoT) environment.
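The SAD and SID metrics used to compare extracted endmembers against reference spectra can be sketched as below. This is a minimal sketch assuming spectra are non-negative NumPy vectors; the standard definitions are used (SAD as the angle between spectra, SID as a symmetric KL divergence of the normalized spectra), not the author's own code.

```python
import numpy as np

def sad(x, y):
    # Spectral Angle Distance: angle (radians) between two spectra;
    # 0 means identical spectral shape regardless of scaling.
    cos = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def sid(x, y, eps=1e-12):
    # Spectral Information Divergence: symmetric KL divergence between
    # spectra normalized to probability distributions; eps avoids log(0).
    p = x / (x.sum() + eps) + eps
    q = y / (y.sum() + eps) + eps
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))
```

For both metrics, smaller values indicate a closer match between the extracted and reference endmember spectra.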
Finally, this chapter evaluates the proposed nonlinear unmixing method for
hyperspectral band selection. Different experiment sets were conducted to study its
Other data
| Title | Hyperspectral Unmixing using Deep Learning |
|---|---|
| Other Titles | الفصل الطيفى باستخدام التعلم العميق (Spectral Unmixing using Deep Learning) |
| Authors | MennaTullah Mamdouh El-Kholy |
| Issue Date | 2020 |
Attached Files
| File | Size | Format |
|---|---|---|
| BB1022.pdf | 599.45 kB | Adobe PDF |