FX Correlations

shanemr

Does anyone know of any indicators that have a high correlation with the USD/EUR FX?

I know that interest rate differentials and other currency pairs have high correlations. But can anyone think of any others?

Regards,

Shane
 

You mean EURUSD FX of course. Yes, try gold.
 
Timeframe matters in this discussion, as over short-term timeframes the linkages can come and go based on a whole set of factors.

Also, keep in mind that the correlations can shift over time. Gold has worked quite well recently, but there have been plenty of times when the markets haven't shown any linkage at all.
 
Both excellent ideas, thanks.

I am a researcher using genetic algorithms to find relationships with the EUR/USD FX. So I am trying to find new inputs to help.

If you think of any others please tell me.

Regards,

Shane
 
Shane,

I've heard of genetic algorithms but am in complete ignorance of them. Perhaps you could briefly explain what they are, and on what basis they are applied to the financial markets (underlying assumptions).

This would be really appreciated.

Grant.
 
Genetic algorithms are a very large area, but basically they are algorithms that have been inspired by nature. Typically these methods use the concepts of mating, selection, crossover, mutation, etc.

What I am doing is evolving a population of neural networks (a technology inspired by the human brain). Initially they are very simple, consisting of a very limited number of neurons. I evaluate how well they perform at a given task, in this case predicting the next day's FX rate, by running historical data through them and asking them to predict the next value in the sequence.

Then I rank them and use evolutionary/genetic principles to create a new population of neural nets: essentially I keep the strong and mate them to produce new individuals, and kill the weak. Darwin and Hitler would have loved this technology ;) After many passes through the historical data, and much mating and evolving, the networks grow in complexity and in knowledge of the data set. Eventually they see the recurring patterns in the data and become better and better at predicting what the next day's FX rate will be.
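As a toy sketch of that loop (hypothetical code, nothing like the real research system: the "networks" here are single linear neurons whose weights evolve, mutation only with no crossover to keep it short, and the "market" is a sine wave):

```python
import math
import random

def predict(weights, window):
    """One linear 'neuron': a weighted sum of the last few observations."""
    return sum(w * x for w, x in zip(weights, window))

def fitness(weights, series, lag=3):
    """Negative mean squared one-step-ahead error (higher is fitter)."""
    errors = [(predict(weights, series[t - lag:t]) - series[t]) ** 2
              for t in range(lag, len(series))]
    return -sum(errors) / len(errors)

def neuroevolve(series, lag=3, pop_size=40, generations=200, seed=1):
    """Keep the strong, mutate them to produce offspring, kill the weak."""
    rng = random.Random(seed)
    pop = [[rng.gauss(0, 0.5) for _ in range(lag)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda w: fitness(w, series), reverse=True)  # rank them
        survivors = pop[: pop_size // 2]                          # selection
        offspring = [[g + rng.gauss(0, 0.1) for g in rng.choice(survivors)]
                     for _ in range(pop_size - len(survivors))]   # mutation
        pop = survivors + offspring
    return max(pop, key=lambda w: fitness(w, series))

# toy 'market': a smooth oscillation the population can learn to anticipate
series = [math.sin(0.3 * t) for t in range(100)]
best = neuroevolve(series)
tomorrow = predict(best, series[-3:])
```

The evolved weights end up far better at one-step prediction than a random starting genome, which is the whole point of the ranking/selection cycle.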

I am not the first in this area. Others have ventured into this space before, though I have my own novel approaches and tricks that I have used to improve upon the research of my predecessors. So I am generating very promising, but as yet entirely untested, results.

If you have any other questions feel free to ask and I will endeavor to answer them.

Regards,

Shane
 
"Eventually they see the recurring patterns in the data and become better and better at predicting what the next day's FX rate will be." – Shane

How do you know the genetic algorithm is not doing just a linear fit to the data and a simple extrapolation of the future values?

I mean, assuming the human brain is the best tool for trading the markets is a fallacy, since we know most traders lose money and I cannot claim those people are stupid; it's just that markets are so non-linear the human brain cannot comprehend them.

Alex
 
Hi Alexander,

To prevent simple linear extrapolation I pre-process the data that I am feeding into the network to remove the simple linear components. This leaves the more difficult to predict (presumably non-linear) components of the time series. Then, after I have made my prediction on the remaining residual non-linear components, I put the two parts back together to get my final prediction.

I do this because predicting the linear components is very easy and fast, and because the non-linear predictors I am using are very bad at predicting the linear components of a time series.
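For what it's worth, the split-and-recombine step can be sketched like this (toy data, with a least-squares line standing in for the "linear component" and a placeholder for the non-linear forecast):

```python
def fit_line(series):
    """Least-squares straight line through (t, series[t])."""
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    slope = (sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
             / sum((t - t_mean) ** 2 for t in range(n)))
    return slope, y_mean - slope * t_mean

def split_linear(series):
    """Remove the linear component; return (trend, residual, coefficients)."""
    slope, intercept = fit_line(series)
    trend = [slope * t + intercept for t in range(len(series))]
    residual = [y - tr for y, tr in zip(series, trend)]
    return trend, residual, (slope, intercept)

def recombine(coeffs, t_next, residual_forecast):
    """Extrapolate the linear part one step ahead and add the
    (separately predicted) non-linear residual back on."""
    slope, intercept = coeffs
    return slope * t_next + intercept + residual_forecast

series = [1.0, 1.2, 1.1, 1.4, 1.3, 1.6, 1.5, 1.8]
trend, residual, coeffs = split_linear(series)
# the residual would go to the non-linear predictor; 0.0 is a stand-in here
next_value = recombine(coeffs, len(series), residual_forecast=0.0)
```

The residual series is what the non-linear predictor sees; the trend plus residual reconstructs the original series exactly.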

Hope that answers some of your questions.

Regards,

Shane
 
Shane,

Thanks for the clarification.

You say the neurons are ranked, and I assume the inputs/questions are fixed or constant. This being the case, couldn't a currently weaker neuron change its ranking because the inputs have become more immediate or relevant? Therefore, wouldn't constant self-testing and self-readjustment be advantageous? By way of analogy, 5- and 10-day MAs may work one day, but due to a change in volatility 4- and 9-day may work better.

Further, how do you assess the validity of inputs? Isn't there an almost infinite range of possibilities? Again, what is relevant at one point could become irrelevant the next.

Without wishing to sound negative (it is really a philosophical question), and possibly echoing Alex's contention, isn't it the case that this is just another example of assuming one complex system may be relevant to another while sharing no discernible characteristics or common features? In simple terms, looking for order (predictability) where none is apparent, because we have difficulty reconciling a lack of structure with our rational minds?

Grant
 
"To prevent simple linear extrapolation I pre-process the data that I am feeding into the network to remove the simple linear components." – Shane


By pre-processing the data to remove the linear components, aren't you altering the dynamics of the system? It is possibly the conjunction of linear and non-linear components that produces future movements. For example, consider the equation of a non-linear spring-mass system:

F = ma + kx^2

where m is the mass, a is the acceleration, k the spring constant and x the position.

The spring force depends on the square of the displacement x, instead of just x as with linear springs.

If you remove the linear term ma by considering a massless system (m = 0) the solution of the non-linear part is trivial and you can easily predict that for a given F, x = sqrt(F/k).

On the other hand, the solution of F = ma for constant F and m, with zero initial conditions, is x = Ft^2/(2m).

Now, if you base your prediction on two different systems and add the solution space you get:

x = Ft^2/(2m) + sqrt(F/k)

which is NOT the solution of the original system and it cannot predict x.
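This point can be checked numerically. Under hypothetical values F = m = k = 1, a rough Euler integration of the full equation gives an x(1) nowhere near the sum of the two decoupled solutions:

```python
import math

def simulate(F, m, k, t_end=1.0, dt=1e-4):
    """Explicit-Euler integration of m*x'' = F - k*x^2 from rest (x=0, v=0)."""
    x, v, t = 0.0, 0.0, 0.0
    while t < t_end:
        a = (F - k * x * x) / m      # acceleration from the full equation
        x += v * dt
        v += a * dt
        t += dt
    return x

F, m, k = 1.0, 1.0, 1.0
x_full = simulate(F, m, k)               # the coupled non-linear system at t = 1
x_mass_only = F * 1.0 ** 2 / (2 * m)     # x = Ft^2/(2m), the k = 0 solution
x_massless = math.sqrt(F / k)            # x = sqrt(F/k), the m = 0 solution
x_superposed = x_mass_only + x_massless  # adding the two partial solutions
```

The full system lands near 0.49 at t = 1, while the superposed "prediction" is 1.5, so adding the two solution spaces does not recover the original dynamics.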

In another sense, you assume you can decompose a time series with respect to prediction, but that is not possible even for the simplest of systems. You can do that (if I remember correctly) only if there exists a canonical transformation in generalized coordinates (it's been a long time, so excuse me if I'm wrong on this one).

Alex
 
grantx,

Well, it is true that the ranking of networks may change as the inputs change. I calculate the fitness over a long series of input patterns, so the nets have to be very dynamic and generalize well to obtain a high fitness over a large number of patterns. Presenting many patterns is also an attempt to stop the networks from simply memorizing the sequence of values. As the market changes they should be able to respond dynamically, as opposed to simpler linear systems, which can't deal with sudden or large market shifts.

The other thing I do is retain the population of networks (not just the best performer) and make sure that the population keeps quite a bit of variety in topology; in other words, a variety of approaches to approximating a solution to the prediction problem. This means that each day, as the dynamics of the market change, I can do another evolutionary iteration and perhaps pick out a new member of the population that is now the superior predictor.
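To illustrate the daily re-selection idea, here is a toy sketch in which simple moving averages stand in for the retained networks (echoing the earlier MA analogy; all names hypothetical):

```python
def moving_average_forecast(window_len):
    """Stand-in 'predictor': forecast the next value as the mean of recent
    points. Real members of the population would be the evolved networks."""
    def forecast(history):
        tail = history[-window_len:]
        return sum(tail) / len(tail)
    return forecast

def walk_forward(series, population, window=10):
    """Each day, re-rank the retained population on its recent squared error
    and let whichever member is currently best make that day's forecast."""
    forecasts = []
    for t in range(window + 1, len(series)):
        def recent_error(predict):
            return sum((predict(series[:s]) - series[s]) ** 2
                       for s in range(t - window, t))
        best = min(population, key=recent_error)  # today's superior predictor
        forecasts.append(best(series[:t]))
    return forecasts

population = [moving_average_forecast(n) for n in (2, 5, 10)]
series = [float(t) for t in range(30)]
forecasts = walk_forward(series, population)
```

The point is only the re-ranking step: the population is kept whole, and the member promoted to "best" can change from day to day as the data changes.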

Perhaps I am looking for something that isn't there. All I know is that my system appears to correctly anticipate trends and new shifts in market direction more often than it incorrectly anticipates them. But maybe there is something wrong with my analysis and I am fooling myself. The real test, of course, will be trading using my system.

In a couple of months, when I am more confident in my analysis tools, would any of you perhaps be interested in taking a look at the output of my system? It would be too premature at this point to have anybody validate it, but once I am happy with it I would like some traders to look at the output and give me feedback on possible pitfalls in my analysis.

Regards,

Shane

 
Hi Alexander,

The thing is, I am approximating a solution to the system, so my approach is analogous to a Taylor expansion rather than an algebraic rewriting of the equations. In this analogy the linear term is one of the terms in the Taylor expansion and the output of the neural network is another. Finally, because I don't continue this expansion out to infinity, there will always be a difference between the real behaviour of the system and my prediction. But hopefully it is small :)

Regards,

Shane

 
"The thing is, I am approximating a solution to the system, so my approach is analogous to a Taylor expansion rather than an algebraic rewriting of the equations." – Shane

A Taylor series expansion is only valid at a point on a curve. For example, given an arbitrary curve, like a price-yield relationship, the first two terms of the expansion are the slope (duration) and the convexity. These quantities change along the curve, so it does not make sense to talk about a linear and a non-linear component of a data set.

If what you are saying were true, every arbitrary curve could be decomposed into a straight line and another curve.

Ultimately, I think what you are doing is detrending the data and then using the noise portion to make predictions.

Alex
 
Alex,

Yes, exactly. I was working too hard to come up with an appropriate analogy :) But you are right; to a certain degree that is exactly what I am doing.

Shane

 
Shane,

Thank you once more for the clarification.

"each day as the dynamics of the market change I can do another evolutionary iteration and perhaps pick out a new member from the population that is now the superior predictor".

Presumably, the ideal would be a real-time model. Is this feasible, or would it take enormous computing power? And is it the case that the shorter the forecasting period, the less the variation and the more accurate the forecast?

"would any of you perhaps be interested in taking a look at the output of my system?". That's simple - just post your predictions. I would advise letting anyone see/know your system; they may nick it.

Good luck,

Grant.
 
Grant,

Yes, real-time prediction is the ultimate goal. The neural network that performs the prediction runs quite fast; the computationally intensive task is evolving the networks in the first place, but this can be sped up using a cluster of cheap computers. There is also an adaptation of some of the base technology I am using that would allow real-time evolution. However, I am not sure if there is a hit to prediction accuracy; I haven't actually used that technology yet.
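The parallelism is straightforward because each fitness evaluation is independent. A minimal sketch (threads stand in here for the cluster's machines, and the fitness function is a hypothetical stand-in for backtesting one network):

```python
from concurrent.futures import ThreadPoolExecutor

def fitness(genome):
    """Stand-in for an expensive evaluation (e.g. backtesting one network)."""
    return -sum((g - 1.0) ** 2 for g in genome)

def evaluate_population(population, workers=4):
    """Score every genome concurrently; on a real cluster each evaluation
    would be farmed out to a separate cheap machine instead of a thread."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fitness, population))

scores = evaluate_population([[0.0], [1.0], [2.0]])
```

`map` preserves input order, so the scores line up with the population for the subsequent ranking step.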

I also have some experience developing real-time trading systems. In a former life I built real-time, high-volume order management and market data systems, so I understand some of the issues that would crop up in building a real-time version of my analysis.

Until I produce the analysis I can't say conclusively whether the predictions will become more accurate over a shorter time period. The consensus among those I've talked to in my field is that accuracy will likely improve with higher-frequency sampling.

Stay tuned to this thread; in (hopefully less than) a couple of months I will post some sample predictions.

Regards,

Shane

 