Improving Convolutional Neural Networks Learning Through Adaptation

Zainab Mohamed Fouad Ibrahim

Abstract


Recently, the Convolutional Neural Network (CNN) has achieved great success on numerous machine learning problems. Many machine learning methods have been developed for such tasks, for example the Artificial Neural Network (ANN), logistic regression, the Support Vector Machine (SVM), and deep learning.
Deep learning, and the CNN in particular, is one of the strategies that can overcome the challenges of the feature extraction process, since deep learning models are usually capable of extracting the proper features by themselves. However, deep CNN models are usually designed manually, and their key parameters are decided by experience and repeated trials, which greatly limits the applications of deep CNNs.
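As an illustration of this manual-design problem, the following is a minimal PyTorch sketch (the framework choice and all layer settings are illustrative assumptions, not taken from the thesis) of a CNN whose depth, kernel counts, and kernel sizes are hard-coded from experience:

```python
# A minimal sketch of a manually designed CNN: every key parameter
# (number of layers, kernel counts, kernel sizes) is fixed by hand.
import torch
import torch.nn as nn

class ManualCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        # Hand-picked architecture: two convolutional blocks, fixed 3x3 kernels.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),   # 32 kernels, 3x3
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 32x32 -> 16x16
            nn.Conv2d(32, 64, kernel_size=3, padding=1),  # 64 kernels, 3x3
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(64 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)  # features are extracted by the network itself
        return self.classifier(torch.flatten(x, 1))

# Example: a batch of 4 RGB images of size 32x32 (CIFAR-10-like input).
logits = ManualCNN()(torch.randn(4, 3, 32, 32))
print(logits.shape)  # torch.Size([4, 10])
```

Changing the depth, the number of kernels, or their sizes requires rewriting this class by hand, which is the dependence on manual involvement discussed below.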
Therefore, designing a proper deep CNN model while reducing the dependence on manual involvement and expertise is a great challenge, and this thesis discusses improving convolutional neural network design from different aspects and with various methods. The improvements include automatically designing the CNN model without operator intervention, modifying the convolution or pooling layers, adding features that save computational resources, and using adaptive and optimized CNN parameters.
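The sketch below (again an illustrative assumption, not the thesis's actual method) shows one way such design decisions can be expressed as a configuration, so that depth, kernel counts, and kernel sizes are supplied by a search procedure rather than fixed by an operator:

```python
# A sketch of expressing the CNN design as a configuration: the structure is
# fully determined by (depth, num_kernels, kernel_sizes), so a search procedure
# can choose them instead of a human operator.
import torch.nn as nn

def build_cnn(depth, num_kernels, kernel_sizes, in_channels=3, num_classes=10):
    """Build a CNN whose structure is fully determined by the configuration."""
    layers, channels = [], in_channels
    for i in range(depth):
        k = kernel_sizes[i]
        layers += [
            nn.Conv2d(channels, num_kernels[i], kernel_size=k, padding=k // 2),
            nn.ReLU(),
            nn.MaxPool2d(2),
        ]
        channels = num_kernels[i]
    # Global average pooling keeps the classifier independent of the input size.
    layers += [nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(channels, num_classes)]
    return nn.Sequential(*layers)

# Two very different models from the same builder, with no manual redesign.
small = build_cnn(depth=2, num_kernels=[16, 32], kernel_sizes=[3, 3])
large = build_cnn(depth=4, num_kernels=[32, 64, 128, 128], kernel_sizes=[5, 3, 3, 3])
```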
On the other hand, many CNN hyper-parameters affect model performance, including the depth of the network, the number of convolutional layers, and the number of kernels and their sizes. It is therefore a huge challenge to design an appropriate CNN model that uses optimized hyper-parameters.
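As a simple illustration of optimizing exactly these quantities, the following random-search sketch samples candidate configurations and keeps the best one. The evaluate function is a hypothetical placeholder for training a candidate and measuring validation accuracy; the thesis itself may use a different, more adaptive search strategy.

```python
# Random search over the hyper-parameters named above:
# depth, number of kernels per layer, and kernel sizes.
import random

SEARCH_SPACE = {
    "depth": [2, 3, 4],
    "num_kernels": [16, 32, 64, 128],
    "kernel_size": [3, 5, 7],
}

def sample_config():
    depth = random.choice(SEARCH_SPACE["depth"])
    return {
        "depth": depth,
        "num_kernels": [random.choice(SEARCH_SPACE["num_kernels"]) for _ in range(depth)],
        "kernel_sizes": [random.choice(SEARCH_SPACE["kernel_size"]) for _ in range(depth)],
    }

def evaluate(config):
    # Hypothetical placeholder: in practice, build the CNN from `config`
    # (e.g. with a builder like build_cnn above), train it briefly, and
    # return the validation accuracy.
    return random.random()

best_config, best_score = None, float("-inf")
for _ in range(20):  # budget of 20 candidate architectures
    config = sample_config()
    score = evaluate(config)
    if score > best_score:
        best_config, best_score = config, score

print("best configuration:", best_config)
```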


Other data

Title: Improving Convolutional Neural Networks Learning Through Adaptation
Other Titles: تحسين تعلم شبكات التوليف العصبية من خلال التكيف (Improving the learning of convolutional neural networks through adaptation)
Authors: Zainab Mohamed Fouad Ibrahim
Issue Date: 2021

Attached Files

File: BB11232.pdf (690.1 kB, Adobe PDF)

