Archive for October, 2010

In the last post, we investigated variations of the one-dimensional input encoding, and we concluded that we were very lucky with the original encoding, in which Monday was separated from Friday, its neighbour in time. However, we prefer approaches that rely less on luck and are more robust. In this post, we study the prediction power […]
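As a hedged illustration of why a one-dimensional day encoding is fragile (a hypothetical sketch, not the post's actual code): in a single numeric feature, which days end up "close" to each other is determined entirely by the ordering we happen to choose.

```python
# Hypothetical sketch: two one-dimensional encodings of the trading days.
# The distance between any two days depends purely on the chosen ordering.

def encode_days(order):
    """Map each day name to an equally spaced value in [-1, 1]."""
    n = len(order)
    return {day: -1.0 + 2.0 * i / (n - 1) for i, day in enumerate(order)}

original = encode_days(["Mon", "Tue", "Wed", "Thu", "Fri"])
shuffled = encode_days(["Wed", "Fri", "Mon", "Thu", "Tue"])

# In the first ordering Mon (-1.0) and Fri (+1.0) sit at opposite ends;
# in the second they become immediate neighbours.
print(abs(original["Mon"] - original["Fri"]))  # 2.0
print(abs(shuffled["Mon"] - shuffled["Fri"]))  # 0.5
```

Any conclusion drawn from such an encoding therefore partly reflects the arbitrary ordering, which is the "luck" the post refers to.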

Consider the abstract from the article “Neural network encoding approach comparison: an empirical study”: “The authors report the results of an empirical study about the effect of input encoding on the performance of a neural network in the classification of numerical data. Two types of encoding schemes were studied, namely numerical encoding and bit […]

There are computational networks other than the Feedforward Backpropagation Network / Multi-Layer Perceptron (FF, MLP) we used in our previous studies. In the 2006 book ‘Artificial Neural Networks in Finance and Manufacturing’ there is an interesting article contending that the GRNN was better than the traditional MLP in their study. We tried it. GRNN is some […]
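For context, the GRNN (General Regression Neural Network, Specht 1991) is essentially a kernel-weighted average of the training targets; unlike an MLP it needs no iterative training, only a smoothing parameter sigma. A minimal one-dimensional sketch, with made-up toy data:

```python
import math

def grnn_predict(x, train_x, train_y, sigma=0.5):
    """GRNN prediction: a Gaussian-kernel-weighted average of the
    training targets, centred on the query point x."""
    weights = [math.exp(-((x - xi) ** 2) / (2 * sigma ** 2)) for xi in train_x]
    return sum(w * yi for w, yi in zip(weights, train_y)) / sum(weights)

# Toy data: the prediction is pulled toward the targets of nearby points.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.0, 1.0, 4.0, 9.0]
print(grnn_predict(1.0, xs, ys))   # near 1, smoothed by the neighbours
print(grnn_predict(2.5, xs, ys))   # between the targets at x=2 and x=3
```

The choice of sigma governs the bias-variance trade-off: a small sigma reproduces the nearest training target, a large one averages over the whole set.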

In this post, we try different scaling methods for mapping the ANN target. In contrast to the previous post, we solve the function approximation task rather than the classification task: we predict not only the direction of the next day, but its value as well. In half of the experiments, we used only a […]
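One common scaling choice (a sketch of the general idea, not necessarily the method the post tested) is a min-max mapping of the targets into the output range of the network's activation, e.g. [-1, 1] for tanh, together with the inverse mapping used to read predictions back out:

```python
def fit_minmax(values, lo=-1.0, hi=1.0):
    """Return scale/unscale functions mapping values into [lo, hi]."""
    vmin, vmax = min(values), max(values)
    span = vmax - vmin
    def scale(v):
        return lo + (hi - lo) * (v - vmin) / span
    def unscale(s):
        return vmin + (s - lo) * span / (hi - lo)
    return scale, unscale

returns = [-2.1, 0.4, 1.3, -0.7, 0.9]      # hypothetical daily % changes
scale, unscale = fit_minmax(returns)
scaled = [scale(r) for r in returns]
print(min(scaled), max(scaled))            # -1.0 1.0
print(unscale(scale(0.4)))                 # round-trips to ~0.4
```

Note that the scaling constants must come from the training set only; applying them to unseen data can push values outside [lo, hi], which is one reason the choice of scaling method matters.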

In this post we define the ANN task as a classification task, in contrast to the general function approximation task. The training data contains only the direction of the move: the target is -1 if the next day was a down day and +1 if it was an up day (zero, at no movement, […]
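The target construction described above can be sketched as follows (a hypothetical illustration; the excerpt is truncated, so the handling of the no-movement case as a 0 label is an assumption):

```python
# Label each day by the direction of the NEXT day's close:
# +1 for an up day, -1 for a down day, 0 assumed for no movement.

def direction_targets(closes):
    targets = []
    for today, nxt in zip(closes, closes[1:]):
        if nxt > today:
            targets.append(+1)   # up day
        elif nxt < today:
            targets.append(-1)   # down day
        else:
            targets.append(0)    # no movement (assumed)
    return targets

closes = [100.0, 101.5, 101.5, 99.8, 100.2]
print(direction_targets(closes))   # [1, 0, -1, 1]
```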

As we found a serious problem in the Matlab implementation of target normalization, we were interested in finding other cases in which the Matlab problem is obvious. Note this one from Harbin Brightman: “I built a neural network and set net.trainParam.epochs=10; I built another neural network, and set net.trainParam.epochs=1; but I call this neural network […]

In this post we warn that the Matlab Neural Network Toolbox developer team made a grievous mistake. They tried to improve usability, but they hid a potentially erroneous behaviour behind the curtain. In the new Matlab NN Toolbox, the team redesigned the usage of the ANN. This quote is from the official […]