
Title
Weight initialization of neural network by using independent component analysis and restricted random noises
Author
Kook Hyun Yoo
Advisor(s)
차경준
Issue Date
2018-02
Publisher
Hanyang University
Degree
Doctor
Abstract
The neural network is a machine learning algorithm that has been studied since the mid-1900s and has recently become a core technique of artificial intelligence. Its two biggest problems are prediction accuracy and the amount of computation required. The algorithm has been studied in various ways to escape poor local minima and to increase classification accuracy, and although computation has become faster with the development of the graphics processing unit, the speed of the algorithm itself still needs improvement. To improve accuracy and reduce the amount of computation, we propose a method for setting the initial weights using independent component analysis (ICA) together with the addition of variance-restricted random noise. The randomly selected initial weights of the neural network are replaced with initial weights obtained by ICA. Once ICA has been applied to the first hidden layer, the nodes of that layer are statistically independent, so the initial parameters from the first to the second hidden layer cannot be estimated by applying ICA again. To solve this problem, we add random noise to the hidden nodes, which also makes the data robust; adding noise for robustness is a common preprocessing step in image data analysis. Since the data are whitened, the random noise has mean 0 so that it does not shift the center of the data, and its small variance is determined statistically under the assumption of Xavier parameter initialization. The proposed algorithm is compared with a neural network using traditional random initial weights. The networks are fitted to the MNIST (Modified National Institute of Standards and Technology) data set to validate the algorithms. For each fitted network we compute the root mean square error, the prediction error on the test data, and the number of steps until the weights converge. The t-test and the Wilcoxon test are applied to the results obtained over repeated runs to test for statistical differences between networks with random initial weights and networks with initial weights set by ICA. The results show that the network initialized by ICA reaches a better local minimum than the randomly initialized network and requires fewer steps for the parameters to converge.
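To make the described initialization concrete, the sketch below shows one way it could look in code. It is a minimal illustration only, assuming scikit-learn's FastICA for the independent component analysis and a Glorot/Xavier-style bound sqrt(2/(fan_in + fan_out)) for the standard deviation of the added noise; the function names, activation, and exact noise model are assumptions, not the thesis's precise procedure.

```python
# Minimal sketch: ICA-based weight initialization with variance-restricted noise.
# Assumes scikit-learn's FastICA and a Xavier/Glorot-style noise scale;
# details are illustrative, not the thesis's exact algorithm.
import numpy as np
from sklearn.decomposition import FastICA


def ica_first_layer_weights(X, n_hidden, seed=0):
    """First-layer weights: the ICA unmixing matrix of the (whitened) inputs."""
    ica = FastICA(n_components=n_hidden, random_state=seed)
    ica.fit(X)                      # X: (n_samples, n_features)
    return ica.components_.T        # (n_features, n_hidden)


def ica_next_layer_weights(H, n_hidden, seed=0):
    """Next-layer weights: the hidden activations H are already statistically
    independent after the first ICA, so zero-mean noise with a Xavier-restricted
    variance is added before running ICA again (assumed noise model)."""
    fan_in, fan_out = H.shape[1], n_hidden
    sigma = np.sqrt(2.0 / (fan_in + fan_out))        # Xavier-style std bound
    rng = np.random.default_rng(seed)
    H_noisy = H + rng.normal(0.0, sigma, size=H.shape)  # mean 0: center unchanged
    ica = FastICA(n_components=n_hidden, random_state=seed)
    ica.fit(H_noisy)
    return ica.components_.T        # (H.shape[1], n_hidden)


# Example usage (shapes only, with an assumed tanh activation):
# W1 = ica_first_layer_weights(X_train, n_hidden=128)
# H  = np.tanh(X_train @ W1)
# W2 = ica_next_layer_weights(H, n_hidden=64)
```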
URI
https://repository.hanyang.ac.kr/handle/20.500.11754/68196
http://hanyang.dcollection.net/common/orgView/200000432255
Appears in Collections:
GRADUATE SCHOOL[S](대학원) > MATHEMATICS(수학과) > Theses (Ph.D.)
Files in This Item:
There are no files associated with this item.