General Regression Neural Network (GRNN)


There are computational networks other than the feedforward backpropagation Multi-Layer Perceptron (FF-MLP) we used in our previous studies. In the 2006 book 'Artificial Neural Networks in Finance and Manufacturing' there is an interesting article contending that the GRNN performed better than the traditional MLP in the authors' study. We tried it. The GRNN is a kind of k-NN algorithm. Its hidden neurons represent the training samples, so if the training set contains 200 samples (as in our case), there are 200 neurons in the hidden layer. Each neuron applies a Radial Basis Function (RBF), a kernel function that is usually the Gaussian, and each neuron has a radius; the training algorithm learns these radii as weights. An input close to a neuron gives that neuron a higher activation, and the weighted outputs of the 200 neurons are summed (and normalized) to produce the output.
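The mechanism above can be sketched in a few lines of NumPy. This is our own minimal reading of the idea (the function name and the epsilon guard are ours, not from the cited article):

```python
import numpy as np

def grnn_predict(X_train, y_train, x, spread=1.0):
    """GRNN estimate in the Nadaraya-Watson sense: every training sample
    acts as a hidden neuron with a Gaussian RBF, and the prediction is
    the kernel-weighted average of the training targets."""
    # squared Euclidean distance from x to each stored training sample
    d2 = np.sum((X_train - x) ** 2, axis=1)
    # Gaussian kernel activation of each hidden neuron
    w = np.exp(-d2 / (2.0 * spread ** 2))
    # normalized weighted sum of the targets; epsilon avoids division by zero
    return np.dot(w, y_train) / (np.sum(w) + 1e-12)
```

Note that there is no training loop and no random initialization here: the "network" is just the stored samples plus the spread, which is why repeated runs give identical results.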

The good thing about the GRNN is that there are fewer parameters to tinker with.
– GRNN is deterministic. There is no random initial weight initialization, so if we repeat the experiment a second time, we get the same result.
– The only parameter to tune is the 'spread'. In contrast, the MLP's parameters include the number of hidden layers, the number of neurons per layer, the learning rate, the maximum number of epochs, the validation/train/test sample proportions, etc.
– GRNN is not sensitive to output scaling. If we multiply the output by 100, we get the same result. (As we saw in our previous posts, this is not true for the MLP.)

More about it can be found, for example, on this page.

1. Changing the ‘spread’ parameter.
For our first test day, the GRNN has to learn this function:

The GRNN surface becomes this for the different spreads of 0.1/0.5/1.0/2.0. Spread = 1.0 is the default behaviour.
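A tiny 1-D illustration of what the spread does (the data here are made up, not our test-day function): a small spread makes the estimate collapse onto the nearest training sample, k-NN-like, while a large spread smooths everything toward the mean of the targets.

```python
import numpy as np

# four made-up 1-D training samples with alternating targets
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0.0, 1.0, 0.0, 1.0])

def grnn(x, spread):
    # Gaussian-kernel-weighted average of the training targets
    d2 = np.sum((X - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * spread ** 2))
    return np.dot(w, y) / np.sum(w)

# tiny spread: the nearest neuron dominates (behaves like 1-NN)
# large spread: all neurons fire similarly, so the estimate drifts
# toward the plain mean of y
for s in (0.1, 0.5, 1.0, 2.0):
    print(s, grnn(np.array([0.9]), s))
```

This is the same trade-off visible in the surfaces above: 0.1 gives a spiky, sample-hugging surface, 2.0 a flat, over-smoothed one.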

2. Back tests

Series 1: test outlier threshold sensitivity:
no outlier limit: D_stat: 51.86%, projectedCAGR: -0.58%, TR: -34.89%
outliers = 4%: D_stat: 52.48%, projectedCAGR: 4.26%, TR: 7.12%
outliers = 3%: D_stat: 52.45%, projectedCAGR: 9.10%, TR: 72.29%
outliers = 2%: D_stat: 52.00%, projectedCAGR: 9.99%, TR: 87.47%
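The metrics above are not defined explicitly in this post; a plausible reading, sketched below with hypothetical function names, is that D_stat% is directional accuracy and TR% is the total return of a sign-following strategy (projectedCAGR would annualize the latter):

```python
import numpy as np

def d_stat(predicted, actual):
    """Directional accuracy in percent: fraction of periods where the
    forecast sign matches the realized sign (our reading of D_stat%)."""
    return np.mean(np.sign(predicted) == np.sign(actual)) * 100.0

def total_return(predicted, actual):
    """Total return in percent of a naive strategy that goes long on a
    positive forecast and short on a negative one (our reading of TR%)."""
    positions = np.sign(predicted)
    return (np.prod(1.0 + positions * actual) - 1.0) * 100.0
```

The compounding in total_return explains why a small edge in D_stat% (52% vs. 50%) can translate into a large spread in TR% over many periods.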
Some TR% charts:
spread = 1, outliers = 4% (TR: 7.12%)

spread = 1, outliers = 3%: (TR: 72.29%)

Series 2: test spread sensitivity
For the outlierFixlimit = 3% case, we varied the spread:
Spread = 2.0: D_stat: 51.71%, projectedCAGR: 1.30%, TR: -20.77%
Spread = 1.0: D_stat: 52.45%, projectedCAGR: 9.10%, TR: 72.29%
Spread = 0.5: D_stat: 50.72%, projectedCAGR: 1.15%, TR: -22.03%

Back test results:
– GRNN is not sensitive to target normalization (multiplying the target gives the same performance).
– spread = 1 is the best (of the spreads 0.5/1.0/2.0).
– sign(nnTargetNormalized) is not bad, but not excellent, so we ignore it. (D_stat: 52.30%, projectedCAGR: 3.77%, TR: 1.88%)
– outlierFixlimit = 3% yields a better TR (about 70%) than the 4% limit, but outlierFixlimit = 4% yields a (marginally) better D_stat. We will choose outlierFixlimit = 3% in the future ensemble.
– Not excluding outliers gave bad results.

The GRNN is not better than the traditional MLP on our forecasting problem. It may be superior on other problems.
The TR% and CAGR% measurements were not really good (they can be unlucky), but we value the D_stat% the most, and that is more than 52% directional accuracy. So we think it is a valuable asset in the ANN arsenal. For small spreads like 0.1 it is quite similar to the k-NN algorithm.
GRNN parameter tinkering can be very unlucky (by chance), but because the algorithm is deterministic, an unlucky configuration will be unlucky all 20 times if we run it 20 times. So we cannot average away this (un)luck by running 20 different GRNN experiments, as we can with the randomly initialized MLP. Therefore we cannot advise using the GRNN as a standalone algorithm. But because it is a different approach than the MLP, in the future we will add this kind of predictor as a member of the ensemble with a small weight.
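The ensemble idea amounts to a weighted average of member forecasts, with the GRNN member deliberately down-weighted. A minimal sketch (the function name and the weights are illustrative, not our final ensemble design):

```python
import numpy as np

def ensemble_forecast(predictions, weights):
    """Weighted average of member forecasts. A GRNN member would get a
    small weight next to the MLP members (weights here are illustrative)."""
    w = np.asarray(weights, dtype=float)
    # normalize so the weights sum to 1, then combine
    return np.dot(w / w.sum(), np.asarray(predictions, dtype=float))

# e.g. two MLP forecasts weighted 1.0 each, one GRNN forecast weighted 0.2
combined = ensemble_forecast([0.4, 0.6, -0.2], [1.0, 1.0, 0.2])
```

Because the members disagree in different ways (random MLPs vs. the deterministic GRNN), even a small GRNN weight adds diversity to the ensemble.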


One Response to “General Regression Neural Network (GRNN)”

  1. Rui

    What do you think is better for timeseries forecasting MLP or FF ?
