Input encoding: 5 dimensional variation


In the last post, we investigated the 1 dimensional input encoding variations and concluded that we were very lucky with the original encoding, in which Monday was separated from Friday rather than being its neighbour. However, we prefer approaches that rely less on luck and are more robust. In this post, we study the prediction power of the following encoding.
Every input is 5 dimensional.
Monday: [1, 0, 0, 0, 0]
Tuesday: [0, 1, 0, 0, 0]
Wednesday: [0, 0, 1, 0, 0]
Thursday: [0, 0, 0, 1, 0]
Friday: [0, 0, 0, 0, 1]
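The mapping above is plain one-hot encoding. A minimal sketch (in Python/NumPy for illustration; the actual experiments use Matlab):

```python
import numpy as np

DAYS = ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"]

def encode_day(day):
    """Return the 5 dimensional one-hot vector for a weekday."""
    v = np.zeros(5)
    v[DAYS.index(day)] = 1.0
    return v

print(encode_day("Wednesday"))  # [0. 0. 1. 0. 0.]
```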

In hindsight, I prefer the -1..+1 encoding to the 0..+1 encoding, but because we trust Matlab's newff() to use the default 'mapminmax' input preprocessing, which maps the input to -1..+1, it doesn't really matter how we select our input range. (The output range does matter, but that is another story.)

We run 2 different tests. The first uses a standalone ANN predictor:

The second uses the ensemble method, in which the ensemble contains 10 members.
The ensemble combines its members with Sum(sign()) instead of the Avg() used in our previous tests.

forecast = sum(sign(standaloneForecasts));
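The Sum(sign()) vote is easy to mimic outside Matlab as well; a Python sketch with hypothetical member outputs (the forecast values below are made up for illustration):

```python
import numpy as np

# hypothetical raw forecasts of the 10 ensemble members (%gain predictions)
standaloneForecasts = np.array(
    [0.8, -0.2, 0.1, -0.5, 0.3, 0.9, -0.1, 0.2, 0.4, -0.3])

# each member casts a +1/-1 vote; the ensemble forecast is the vote balance
forecast = np.sum(np.sign(standaloneForecasts))
print(forecast)  # 2.0: six members vote up, four vote down
```

Unlike Avg(), this weights every member equally, regardless of how large (confident) its raw forecast is.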

1. The D_stat directional accuracy (51.5%) is not as good as in our best 1 dimensional encoding case (53%), but surprisingly the CAGR (Compound Annual Growth Rate) = 12.6% and TR (Total Return) = 135% are quite pleasant. For the nEnsembleMembers = 10, nEpoch = 6 case, here is the Total Return chart.

Not bad. The volatility is not too high.

We consider this test successful. The 5 dimensional encoding is better than the average 1 dimensional case (though not better than the best 1 dimensional encoding). Therefore, we would like to use the 5 dimensional encoding in the future. It relies less on luck and requires less parameter fine-tuning (the encodingTypeModulus = 0, 1, 2, 3, 4 parameter can be omitted in future tests).

What explains this better 'overall performance'? Let's try to visualize the encoding in the 2 cases. It is difficult, because we cannot visualize a 5 dimensional function F(x) = y, in which x is 5 dimensional and y is 1 dimensional. First, let's inspect the 1 dimensional case. Suppose we start to approximate the function in the first image. I illustrated a solution with a red line.

Compared to this, try to approximate a function in which Friday is a huge up day. A likely solution is the red line:

The huge Friday up day doesn't modify the Monday forecast, because Monday is not a neighbour of Friday. However, it increases the forecasted Thursday %gain, because Thursday is a neighbour of Friday in this encoding scheme.
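This neighbour effect can be reproduced with a toy least-squares line fit in the 1 dimensional encoding (a Python sketch, not the actual ANN; the %gain numbers are made up):

```python
import numpy as np

days = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # Mon..Fri, scalar encoding
gains = np.array([0.0, 0.0, 0.0, 0.0, 2.0])  # a huge Friday up day

# fit gains ~ slope*days + intercept by least squares
A = np.vstack([days, np.ones(5)]).T
slope, intercept = np.linalg.lstsq(A, gains, rcond=None)[0]

thursday = slope * 4 + intercept  # 0.8: pulled up by its neighbour Friday
monday = slope * 1 + intercept    # -0.4: not pulled up at all
```

With a straight line the Monday forecast even tilts slightly down; a curved approximator can leave it roughly unchanged, which is the behaviour sketched in the figure.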
Let's see what happens in 5 dimensions. Actually, we plot only a 2 dimensional subspace of it:

The red rectangles represent the 1st forecast case: the small-Friday version. The green line is the first forecast line. However, when Friday goes up (the blue rectangle case), the approximated blue line is lifted, and it is higher not only at the Friday point, but at the Monday point too.
This happens because our approximation function has some constraints. It has 2 neurons and 2 weight parameters + the bias, that is, 3 parameters to synthesize, and we are in a 5 dimensional space. So the function approximation is not a straight, linear line, but a curve (which is not illustrated well here).
In the 5 dimensional space, Friday becomes a dimensional neighbour of Monday.

Consider an even more important detail: moving from 1 dimension to 5 dimensions completely redesigns the 'neighbour' relationship. While in the 1 dimensional space Wednesday was 2 steps away from Friday (an indirect neighbour), in the 5 dimensional space everyone is a direct neighbour of everyone. That is a very strange, new concept. Note that Friday is now a neighbour even of Wednesday and Tuesday.
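This redesigned neighbour relationship is easy to check numerically: in the scalar encoding the day-to-day distances differ, while every distinct pair of one-hot vectors is exactly sqrt(2) apart (a Python sketch):

```python
import numpy as np

scalar = np.arange(1.0, 6.0)  # Mon..Fri as 1..5
onehot = np.eye(5)            # Mon..Fri as one-hot rows

# pairwise distances in each encoding
d1 = np.abs(scalar[:, None] - scalar[None, :])
d5 = np.linalg.norm(onehot[:, None, :] - onehot[None, :, :], axis=2)

print(d1[2, 4])  # 2.0: Wednesday is 2 steps from Friday
print(d5[2, 4])  # 1.414...: the same sqrt(2) as any other distinct pair
```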
Is it good or bad? We don’t know. It is different. Probably it depends on the specific task.
For example, in the October 2010 bull market, all 10 of the market's upward pushes happened to fall on Fridays. In the 5 dimensional case, this elevates the forecast for all the days, Monday…Friday, injecting an overall bullishness into the prediction, while in the 1 dimensional case it only elevates the forecasts of Thursday and Friday.
In an opposite, bearish market, the same can happen. If all the selling happens to fall on Mondays (bearish Mondays), it gives an overall bearishness to all days in the 5 dimensional case, while it decreases the forecasts of only Monday and Tuesday in the 1 dimensional case.
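An extreme toy case shows why the one-hot encoding spreads a Friday effect over every day: if the approximator is so constrained that it can only fit a single bias in the 5 dimensional space, it predicts the mean for all days (a Python sketch with made-up numbers, not the actual ANN):

```python
import numpy as np

# hypothetical %gains where only Fridays went up strongly
gains = np.array([0.0, 0.0, 0.0, 0.0, 2.0])  # Mon..Fri

# a bias-only model (fewer parameters than dimensions) fits the mean
bias = gains.mean()  # 0.4

# the forecast for *every* one-hot input is then the same bias:
forecasts = np.full(5, bias)  # overall bullishness: all days lifted to 0.4
```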

The tests show that for the standalone case nEpoch = 5 is quite good, and for the ensemble case we will pick nEpoch = 5 as well. We will choose this configuration in the future.

We found that there is less randomness, less variance in the 5 dimensional case than in the 1 dimensional case.
See this image from the previous post:

You can see this in the performance images. In the 1 dimensional case (nEpoch = 4, nEnsemble = 10, Modulus = 0), the D_stat varied between 52.36% and 53.53% when we ran the same experiment multiple times, due to randomness. However, in the 5 dimensional case (nEpoch = 4, nEnsemble = 10), the D_stat is quite stable, ranging from 51.24% to 51.54%.
The same can be observed for TR. In the 1 dimensional case, even with 10 ensemble members, nEpoch = 4, Modulus = 0, we could see TR = (123%, 436%, 211%, 250%) across the same tests. With the 5 dimensional case, the TR is quite stable at TR = (134%, 138%, 137%, 149%).
Overall, we decide that the 5 dimensional case is more stable. Despite giving up our previously best CAGR% = 16.30% (1 dimensional, Modulus = 0) case, we would like to move to the less profitable, CAGR% = 12.68%, but 5 dimensional input in the future.
This doesn't mean we completely drop the 1 dimensional input. It can be very handy for visualizing data, testing new ideas and understanding them. So in future debugging and visualization scenarios we will probably use the 1 dimensional case, but in the production environment we apply the 5 dimensional input.

