Build Neural Network Indicator in MT4 using Neuroshell

Quentin, you could go to the Noxa Analytics site and download their tutorial on entropy-based indicators.

I assure you they do not misuse the word entropy. I may misuse it through lack of understanding, but they certainly don't.

So basically my idea was to reverse engineer what they did in MT4. What I have is the binary entropy, but to me it is sexy enough.
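
For illustration, here is a minimal sketch of what I mean by binary entropy: the Shannon entropy of the up/down move distribution over a rolling window. It is in Python rather than MQL4, just to show the calculation; the 50-bar window and the use of closes are my own arbitrary choices.

```python
import numpy as np

def binary_entropy(p):
    """Shannon entropy (in bits) of a Bernoulli distribution with P(up) = p."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -(p * np.log2(p) + (1.0 - p) * np.log2(1.0 - p))

def rolling_binary_entropy(closes, window=50):
    """Binary entropy of up/down closes over a rolling window (NaN until filled)."""
    closes = np.asarray(closes, dtype=float)
    ups = (np.diff(closes) > 0).astype(float)           # 1 = up bar, 0 = down/flat bar
    out = np.full(closes.shape, np.nan)
    for i in range(window, len(closes)):
        p_up = ups[i - window:i].mean()                  # estimated P(up) in the window
        out[i] = binary_entropy(p_up)
    return out                                           # near 1 = noisy, near 0 = one-sided

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prices = 1.3 + np.cumsum(rng.normal(0, 1e-4, 1000))  # fake price series for a quick test
    print(rolling_binary_entropy(prices)[-5:])
```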

feci, quod potui, et faciant meliora potentes (I did what I could; let those who are more able do better)

And by doing it in MT4, others may plug in other learners, benefit from that input, and test whatever comes to mind.

I see two visual patterns:

1. When there is a sharp drop in entropy, look for a breakout trading opportunity.
2. When the entropy is too high (a spike), look for a black-noise reversal opportunity.
 
I see two visual patterns:

1. When there is a sharp drop in entropy, look for a breakout trading opportunity.
2. When the entropy is too high (a spike), look for a black-noise reversal opportunity.
Aha. But you can replace the word 'entropy' with 'range' or 'volatility' and it will still serve the same function and correlate directly with the trading operation. Proper range measurement can exploit the raw HLC data; there is plenty of information held in the H, L, and C values before everybody starts grinding and averaging them down to a single, nearly useless value (e.g. ATR(x)).

'Drop' is when the value is below a threshold defined through statistical analysis of the volatility data per time interval, in units of sigma/stddev. 'Too high' is the same thing, just on the opposite side of the distribution. These are robust & universal measures. Mark the thresholds and you've partitioned the market states into 3 classes: 2 with trading plans and 1 to hold. Volatility at the extremes will revert to the magnitude of the last classification, carrying along different behaviour while it undergoes the mean-reverting transition. The question becomes: what makes any other indicator or method of measurement more special? The old-school methods are actually sexy if properly understood in their application. Just a friendly reminder. :)
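
As an illustration only, a minimal sketch of that partitioning (Python; the log(H/L) range measure, the 250-bar lookback and the ±1 sigma thresholds are just example choices, not recommendations):

```python
import numpy as np

def classify_market_state(high, low, lookback=250, k=1.0):
    """Partition bars into 3 classes from a plain range measure:
       -1 = unusually low range ('drop'), +1 = unusually high range ('spike'), 0 = hold."""
    high, low = np.asarray(high, float), np.asarray(low, float)
    bar_range = np.log(high / low)                 # raw per-bar range, straight from H and L
    state = np.zeros(len(bar_range), dtype=int)
    for i in range(lookback, len(bar_range)):
        window = bar_range[i - lookback:i]
        mu, sigma = window.mean(), window.std()
        z = (bar_range[i] - mu) / sigma            # distance from typical range, in stddev units
        if z < -k:
            state[i] = -1                          # below threshold -> look for breakout setups
        elif z > k:
            state[i] = +1                          # above threshold -> look for reversal setups
    return state
```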
 
Thanks for the answer. I am interested that you say you prefer information from the raw bars (High, Open, Low, Close). In my studies I have read that simplifying the data improves things: SSA, wavelets, ICA, etc. I ran my own tests and indeed, when I use SSA, I get a better prediction. However, it of course depends on what you want to predict.
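
For reference, a minimal SSA sketch of the kind of simplification I mean (Python/numpy; the window length and the number of retained components are arbitrary choices for illustration):

```python
import numpy as np

def ssa_smooth(series, window=30, n_components=3):
    """Basic singular spectrum analysis: embed, decompose, and reconstruct
       the series from its leading components (a crude denoised signal)."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    k = n - window + 1
    # Trajectory (Hankel) matrix: each column is a lagged window of the series.
    traj = np.column_stack([x[i:i + window] for i in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    # Keep the leading components and rebuild the trajectory matrix.
    rank = min(n_components, len(s))
    approx = (u[:, :rank] * s[:rank]) @ vt[:rank, :]
    # Diagonal averaging (Hankelization) turns the matrix back into a series.
    recon = np.zeros(n)
    counts = np.zeros(n)
    for col in range(k):
        recon[col:col + window] += approx[:, col]
        counts[col:col + window] += 1
    return recon / counts
```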

I am focused a lot on volatility prediction. There are some patterns there. I would not entirely rely on a neural net for directional bias.

In fact, with friends I built an expert advisor based on this pattern:
- a transition of the fractal dimension (iVAR indicator) from one level down to a lower level.

When this is identified, an entry is executed with a directional algorithm. Actually I used a modified polarized fractal efficiency (for the name, of course, I wanted everything fractal lol).

I wanted this expert to test whether the drop in fractal dimension really offers some edge. I was able to find some robust parameters because I used the standard optimizer and optimized just two parameters.
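
In the meantime, here is a minimal sketch of the logic (Python for illustration, not the actual MQL4 expert; the fractal-dimension series is assumed to come from iVAR or similar, the PFE here is the plain textbook form rather than my modified one, and the 0.5 threshold and 10-bar period just stand in for the two optimized parameters):

```python
import numpy as np

def pfe(closes, period=10):
    """Polarized fractal efficiency (textbook form, unsmoothed): signed ratio of the
       straight-line distance to the price path length over `period` bars."""
    c = np.asarray(closes, float)
    out = np.full(len(c), np.nan)
    for i in range(period, len(c)):
        straight = np.sqrt((c[i] - c[i - period]) ** 2 + period ** 2)
        path = sum(np.sqrt((c[j] - c[j - 1]) ** 2 + 1.0)
                   for j in range(i - period + 1, i + 1))
        sign = 1.0 if c[i] >= c[i - period] else -1.0
        out[i] = sign * 100.0 * straight / path
    return out

def entry_signals(closes, fractal_dim, fd_threshold=0.5, period=10):
    """+1 buy / -1 sell / 0 nothing: trade only when the fractal dimension drops
       through the threshold, with the direction taken from the PFE sign."""
    fd = np.asarray(fractal_dim, float)
    eff = pfe(closes, period)
    sig = np.zeros(len(fd), dtype=int)
    for i in range(1, len(fd)):
        dropped = fd[i - 1] >= fd_threshold and fd[i] < fd_threshold   # level transition down
        if dropped and not np.isnan(eff[i]):
            sig[i] = 1 if eff[i] > 0 else -1
    return sig
```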

I will upload it shortly, once I'm home.
 
quentin123 said: 'Drop' is when the value is below a threshold defined through statistical analysis of the volatility data per time interval, in units of sigma/stddev. 'Too high' is the same thing, just on the opposite side of the distribution. These are robust & universal measures. Mark the thresholds and you've partitioned the market states into 3 classes: 2 with trading plans and 1 to hold. Volatility at the extremes will revert to the magnitude of the last classification, carrying along different behaviour while it undergoes the mean-reverting transition. The question becomes: what makes any other indicator or method of measurement more special? The old-school methods are actually sexy if properly understood in their application. Just a friendly reminder. :)
Yes, in my opinion entropy does a very similar job to e.g. Bollinger Bands or the Ehlers Fisher Transform indicator; it is just another preprocessing of the data in the hope that it will extract some useful info.

So can you answer my question from the previous mail, i.e. point to some document which proves that entropy is useful for trading?? Because BB or Fisher, yes, they are useful, but only sometimes, like all other technical indicators, and when you evaluate them properly, systems using them won't be profitable over a big enough number of trades, which removes the 'luck' component.

Krzysztof
 
A presentation on Low Entropy trading by Christopher Zapart:
http://www.unifr.ch/econophysics/papers/201011/7080368565.pdf

Love it!! So do you buy it???

See slide 'Lucky trading'

1) 59 positions - not enough to remove the 'luck' component.

2) In order to properly evaluate a trading system, the full confusion matrix of the system must be created, the ROC curve drawn, and other evaluators like e.g. the Kappa statistic calculated. Specifying the % of profitable trades is not enough; it is just an analysis of the positive examples... but that's the way trading systems are usually evaluated and presented. That way it is much easier to get better-looking results....
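
For example, a minimal sketch of that kind of evaluation (Python with scikit-learn; the labels and signal here are purely synthetic and hypothetical):

```python
import numpy as np
from sklearn.metrics import confusion_matrix, cohen_kappa_score, roc_auc_score

# Hypothetical evaluation data: y_true = 1 if the market actually moved in the traded
# direction, 0 otherwise; y_score = the system's raw signal strength; y_pred = the
# thresholded buy/no-buy decision.
rng = np.random.default_rng(1)
y_true = rng.integers(0, 2, size=1000)
y_score = y_true * 0.3 + rng.normal(0, 1, size=1000)   # weakly informative signal
y_pred = (y_score > 0.5).astype(int)

print("confusion matrix (rows = actual, cols = predicted):")
print(confusion_matrix(y_true, y_pred))
print("Cohen's kappa:", cohen_kappa_score(y_true, y_pred))   # agreement beyond chance
print("ROC AUC:", roc_auc_score(y_true, y_score))            # threshold-independent quality
# The % of profitable trades alone only looks at the predicted-positive column.
```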


Krzysztof
 
So can you answer my question from the previous mail, i.e. point to some document which proves that entropy is useful for trading?? Because BB or Fisher, yes, they are useful, but only sometimes, like all other technical indicators, and when you evaluate them properly, systems using them won't be profitable over a big enough number of trades, which removes the 'luck' component.
I'm not qualified for that. My stance on entropy, as I've implied in my posts, is that it's just a fancy word for an event-information measure. We use it all the time; we just didn't give it a name.

There are only so many ways to extract information out of a data set, and it is only useful relative to the resolution of the data. So far I do not see significantly more information gained from anything other than classic statistical tools when they are applied properly, at least not at the resolutions I know. Speaking of data resolution: smoothing data and then extrapolating info from it is, to me, like putting an animal in a blender and then trying to figure out what it was from the juice. If you're not convinced, try taking 1.5 cycles of a sinusoid and chopping it into 8 discrete HLC bars. No different from being drowned in a lake with an 'average' depth of 3 ft. Which information is it that matters? How many data points do we really need?
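
A quick numeric version of that sinusoid example (Python; 1.5 cycles sampled densely, then compressed into 8 HLC-style bars):

```python
import numpy as np

# 1.5 cycles of a sinusoid, sampled densely (the 'true' process).
t = np.linspace(0.0, 1.5 * 2 * np.pi, 480)
x = np.sin(t)

# Chop it into 8 bars and keep only High/Low/Close per bar.
bars = x.reshape(8, 60)
high, low, close = bars.max(axis=1), bars.min(axis=1), bars[:, -1]

for i in range(8):
    print(f"bar {i}: H={high[i]:+.2f}  L={low[i]:+.2f}  C={close[i]:+.2f}")

# 480 points have been collapsed to 24 numbers; the exact shape and timing of the
# turning points inside each bar are gone, even though the 'average' looks fine.
print("mean of dense series:", x.mean(), "mean of closes:", close.mean())
```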

Sidenote: I think Highfreq has dropped a lot of useful clues here. I hope many realize that. He has certainly opened my eyes. I, for one, am very grateful for this.
 
A presentation on Low Entropy trading by Christopher Zapart:
http://www.unifr.ch/econophysics/papers/201011/7080368565.pdf

Look at the slide titled 'Low entropy trading'. He says:

'No High Frequency Trading'

Why?? By downsampling, e.g. from 1 min to daily, so by a factor of 1440, we are losing a lot of information!!! Or maybe because it is easier to curve-fit the strategy backwards to a lower number of samples???

And the next slide with the Nikkei and the equity curve:

It's clear that for this period the market was ranging, so any oscillator-based strategy on daily charts should work there.

But if we analyzed the same period on 1-min charts, the story would be very different: to catch the range cycle, the observation window would have to be very big, so for 'normal-size' windows this market would look like zig-zag trends, and a mean-reversion strategy would not work.... tricky.... tricky..... Time frame, strategy type and market seem to be 'miraculously' matching!!!!

Krzysztof
 
Hi,

I have not found anything yet for Chaos Hunter.

Some people may have problems installing NS2 on Windows 7 x64 and x32, but I managed to do it.

There are two methods:
1. Install a virtual machine (you can download one freely from Microsoft), then install Windows XP in it. Afterwards you can run the NS2 setup.

2. You can copy the extracted files directly from a Windows XP installation into your Windows 7 (either x32 or x64)
NeuroShell folder (c:\Neuroshell 2\); also copy all the required DLL and OCX files into the c:\Neuroshell 2 folder. Register all of them from the command prompt using regsvr32,
for example: regsvr32 /s COMDLG32.ocx
or
regsvr32 /s asycfilt.dll

Here is the DLL list you need:
asycfilt.dll
comcat.dll
comdlg32.dll
ctl3d32.dll
EZGA32.DLL
GAPI32.DLL
Gswag32.dll
GSWDLL32.DLL
hlinkprx.dll
Implode.dll
INETWH16.DLL
inetwh32.dll
inloader.dll
mfc40.dll
MFC42ENU.DLL
MFCANS32.DLL
MSFL651.DLL
MSSTKPRP.DLL
msvbvm50.dll
msvcrt40.dll
NS2-32.DLL
NSGA32.DLL
NSHELL2.DLL
NSTGA32.DLL
NSTOMG32.DLL
NSTRD-TS.DLL
NSTRDAUT.DLL
OC25.DLL
Oc30.dll
OLE2CONV.DLL
OLE2PROX.DLL
oleaut32.dll
RHMMPLAY.DLL
SCP.DLL
TPROP32.DLL
TPROP321.DLL
TRADES.DLL
VB40016.DLL
VB5DB.DLL
vb5stkit.dll
vbar332.dll
VCFIDL32.DLL
VCFIWZ32.DLL
_ISREG32.DLL

And here are the OCX files you need:
COMDLG32.ocx
Dblist32.ocx
GRAPH32.OCX
Mscomm32.ocx
MSFLXGRD.OCX
picclp32.ocx
SPIN32.OCX
ss32x25.ocx
ssa3d30.ocx
SSDOCK32.OCX
SSFORM32.OCX
tabctl32.ocx
THREED32.OCX
VCFI32.OCX
Vsocx32.ocx

You can also create a batch file and run it from the command prompt to register all of them. See the attached picture showing NS2 running directly on my Windows 7 x64.

Good luck
 

Attachments

  • NS2 run in Windows 7 x64.png (157.3 KB)
In order to open the help file in NS2 when running it on Windows 7, you should download winhlp32.exe from Microsoft (or you can find it somewhere else).
 
Here is a simple tutorial about NS2 made by someone:

ns2.avi - 4shared.com - file sharing - download movie file

Good luck

I believe NSDT and NS2 suffer from a few major drawbacks which prevent them from being effective in current markets.

1) The algorithms used in both NS2 and NSDT are very much outdated; they were developed in the mid-'90s, which was the Stone Age of machine learning. From my investigations during the last year I can say that this area has developed a lot, and sadly none of the modern methods have been implemented as indicators.

2) The learning algorithm used by NeuroShell (using GO) is well known to have a tendency to overfit; currently completely different algorithms are considered the best.

3) The size of the training set is very limited. My current system uses about 100k bars and 170 or 653 inputs, and I can train it within a few minutes to a few hours (see the sketch after this list). Try that with NS2/NSDT...

4) Scaling of data for real time. NSDT scales the data automatically for NN predictions
on the training set, and I wonder how they do this for unseen data?? How does NS2 do it??

I think a much more powerful combination could be RapidMiner + MT4; I believe an interface between them has already been developed, not to mention Matlab of course....

That's my opinion; of course other people may think differently...

Krzysztof
 
Look at the slide titled 'Low entropy trading'. He says:

'No High Frequency Trading'

Why?? By downsampling, e.g. from 1 min to daily, so by a factor of 1440, we are losing a lot of information!!! Or maybe because it is easier to curve-fit the strategy backwards to a lower number of samples???

And the next slide with the Nikkei and the equity curve:

It's clear that for this period the market was ranging, so any oscillator-based strategy on daily charts should work there.

But if we analyzed the same period on 1-min charts, the story would be very different: to catch the range cycle, the observation window would have to be very big, so for 'normal-size' windows this market would look like zig-zag trends, and a mean-reversion strategy would not work.... tricky.... tricky..... Time frame, strategy type and market seem to be 'miraculously' matching!!!!

Krzysztof

I think you are right. There is calibration going on here. But this is very normal and the author acknowledges it. You don't expect papers describing the ultimate solution to be published for free, do you? If such a thing exists at all, which I doubt.

One solution that serves me well is to identify a bunch of these systems that generate equity for some time and switch between them when appropriate, very much like Watson did in Jeopardy. Build a bag of many strategies and wrap them on the fly in a meta-algorithm. Treat them as if they were part of a dynamic asset allocation problem.
 
I think you are right. There is calibration going on here. But this is very normal and the author acknowledges it. You don't expect papers describing the ultimate solution to be published for free, do you? If such a thing exists at all, which I doubt.

One solution that serves me well is to identify a bunch of these systems that generate equity for some time and switch between them when appropriate, very much like Watson did in Jeopardy. Build a bag of many strategies and wrap them on the fly in a meta-algorithm. Treat them as if they were part of a dynamic asset allocation problem.

He is not doing it for free!!! I think the author of this paper works or worked at
The Institute of Statistical Mathematics, so he is being paid a salary for generating
research papers....

The Institute of Statistical Mathematics

Regarding your meta-algorithm: is it a classic meta-algorithm like e.g. a 'voted classifier',
or your own solution like e.g. 'if the equity curve is rising then I play this strategy'??

Personally I don't see any reason why e.g. the second solution should be superior to just a single strategy.

A trading strategy is just a filter which takes input data and generates buy/sell signals. A bunch of strategies is then a filter bank, where each filter (strategy) is supposed to find profitable patterns in the market data and generate profitable trades.

But in that case a single strategy which covers all the patterns recognizable by the strategy bank should be the best, because you avoid the switching problem, which for sure introduces noise into the whole process (e.g. due to lag).

Krzysztof
 
He is not doing it for free!!! I think the author of this paper works or worked at
The Institute of Statistical Mathematics, so he is being paid a salary for generating
research papers....

The Institute of Statistical Mathematics

Regarding your meta-algorithm: is it a classic meta-algorithm like e.g. a 'voted classifier',
or your own solution like e.g. 'if the equity curve is rising then I play this strategy'??

Personally I don't see any reason why e.g. the second solution should be superior to just a single strategy.

A trading strategy is just a filter which takes input data and generates buy/sell signals. A bunch of strategies is then a filter bank, where each filter (strategy) is supposed to find profitable patterns in the market data and generate profitable trades.

But in that case a single strategy which covers all the patterns recognizable by the strategy bank should be the best, because you avoid the switching problem, which for sure introduces noise into the whole process (e.g. due to lag).

Krzysztof

The switching is taken care of by an ensemble learning method. There is no lag induced.
Ensemble learning - Wikipedia, the free encyclopedia

The equity is almost as good as that of the best model in the bag, in hindsight. A bound on how wrong you can be is guaranteed, which is not the case with single models.
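
For reference, a minimal sketch of one such scheme, an exponentially weighted (Hedge-style, multiplicative-weights) mix over a bag of strategies; the per-period returns are synthetic and the learning rate is an arbitrary choice. For bounded rewards, the cumulative result of such a mix provably trails the best single strategy in hindsight by a bounded amount, without knowing in advance which one that is:

```python
import numpy as np

def hedge_allocation(strategy_returns, eta=0.5):
    """Exponentially weighted ensemble over a bag of strategies.
       strategy_returns: array of shape (T, N) with per-period returns of N strategies.
       Returns the ensemble's per-period return series."""
    T, N = strategy_returns.shape
    log_w = np.zeros(N)                          # log-weights, start uniform
    ensemble = np.zeros(T)
    for t in range(T):
        w = np.exp(log_w - log_w.max())
        w /= w.sum()                             # current allocation across strategies
        ensemble[t] = w @ strategy_returns[t]    # trade the weighted mix this period
        log_w += eta * strategy_returns[t]       # reward strategies that did well
    return ensemble

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    rets = rng.normal(0.0, 0.01, size=(2000, 5))
    rets[:, 2] += 0.001                          # one strategy has a small real edge
    mix = hedge_allocation(rets)
    print("best single strategy:", rets.sum(axis=0).max())
    print("ensemble mix:", mix.sum())            # lags the best by a bounded amount
```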
 