Feature Request: Better Granular Processing
Granular Processing can enhance the accuracy of a backtest by a big margin.
In another discussion thread it turned out that EOD data is usually adjusted for splits and dividends while intraday data is not.
This causes Granular Processing to fail for all symbols with splits or dividends.
In the same thread a solution was proposed:
Granular processing should take this mismatch into account and adapt intraday data to EOD data, i.e. multiply the intraday series of the current day by (EOD_OPEN) / (Intraday_Open) before doing its processing (searching the correct entry time).
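The proposed per-day scaling can be sketched in a few lines of Python (hypothetical prices; assumes the EOD series is split-adjusted and the intraday series is raw):

```python
# Hypothetical example: a 2-for-1 split means the EOD history is
# back-adjusted (halved) while the raw intraday series for that day is not.
eod_open = 50.0                  # split-adjusted EOD open for the day
intraday = [100.0, 101.0, 99.5]  # raw (unadjusted) intraday prices

factor = eod_open / intraday[0]  # (EOD_OPEN) / (Intraday_Open)
adjusted = [p * factor for p in intraday]

print(adjusted)  # [50.0, 50.5, 49.75]
```

After scaling, the intraday series lines up with the adjusted EOD bar, so searching for the correct entry time compares like with like.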
Reply from Cone:
QUOTE:
Good solution!
Although to save cycles I'd change it to, "multiply the intraday price by (EOD_OPEN) / (Intraday_Open) as each bar is processed."
However, if using Wealth-Data, the adjustment should be EOD_CLOSE / Intraday_CLOSE. They won't always refer to the same price, but the comparison will be closer than for the opening trade, since Wealth-Data has the Open on the Primary Market, which is frequently different by several percent from the first trade, which may occur on another exchange.
Please vote for this #FeatureRequest if you want to see more accurate backtest results for your Limit Order Strategy.
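Cone's close-based, bar-by-bar variant might look like this sketch (hypothetical prices; `eod_close` stands for the Wealth-Data session close, which is an assumption for illustration):

```python
# Hypothetical prices: Wealth-Data EOD close vs. the raw intraday session close.
eod_close = 49.9
intraday_close = 99.8             # last intraday print of the session
intraday_bars = [100.0, 100.4, 99.8]

factor = eod_close / intraday_close  # EOD_CLOSE / Intraday_CLOSE
# Scale each bar as it is processed instead of rewriting the whole series
# up front - the cycle-saving variant suggested above.
adjusted_bars = [price * factor for price in intraday_bars]
```

Using the session close rather than the open avoids the primary-market-open mismatch Cone describes, at the cost of the two closes not always being the exact same print.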
We discussed this one in the German User Group meeting yesterday.
That might be the reason for the large number of up-votes...
Yes, we discovered that a while ago (not just for this one).
It doesn't feel like granular processing is the place to be adjusting intraday data.
Shouldn't this happen above that level at the point of collecting the data?
The problem is how the EOD is adjusted... or not.
For example, Wealth-Data is adjusted for special dividends; IQFeed data won't be. I could go on with about 12 more permutations of examples, including the 3 different ways you can adjust Norgate data.
The idea is to make the best use of what you have. An extra calculation based on lining up the session closing prices will make that happen.
Right. But granular processing is not the place to address any or all of those 12 cases.
I added edits to Post #5
How else could you solve it?
I’d either make the provider return the data with the same kind of adjustments, or, if that’s not possible, create a new data provider that can return intraday data adjusted for an event provider's splits and/or dividends.
It clearly doesn’t belong in the granular processing logic.
Imagine the confusion and headaches if this was done by granular processing!! Nobody would be able to see or verify the adjusted data. A new provider that can do this, allowing us to see the adjusted data on a chart, is the way.
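A provider-level adjustment along these lines could be sketched as follows (hypothetical split events and helper function, not an actual WealthLab API; the idea is to divide out every split that occurs after a given bar's date):

```python
# Hypothetical split events from an "event provider": (effective date, ratio).
splits = [("2023-06-02", 2.0)]  # 2-for-1 split effective this date

def adjustment_factor(day, splits):
    """Cumulative divisor for all splits occurring after `day`."""
    f = 1.0
    for split_day, ratio in splits:
        if day < split_day:
            f /= ratio
    return f

# Raw intraday bars: (day, price). The pre-split bar gets halved.
bars = [("2023-06-01", 100.0), ("2023-06-02", 50.5)]
adjusted = [(d, price * adjustment_factor(d, splits)) for d, price in bars]

print(adjusted)  # [('2023-06-01', 50.0), ('2023-06-02', 50.5)]
```

Because the adjustment lives in a provider, the scaled series would be visible on a chart and verifiable, which is exactly the point being argued here.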
Okay, but fwiw, there would be no confusion and no changing of data if none were required.
It's not a bunch of logic. Here it is:
CODE:
/* If Daily, always Filter intraday for Market Hours */
BarHistory intradayBars = WLHost.Instance.GetHistory(bars.Symbol, BacktestSettings.IntradayWeightScale, DateTime.MinValue, DateTime.MaxValue, 0, null);
BarHistory intradayDaily = BarHistoryCompressor.ToDaily(intradayBars);
intradayDaily = BarHistorySynchronizer.Synchronize(intradayDaily, bars);
TimeSeries ifactor = bars.Close / intradayDaily.Close << 1;
ifactor = TimeSeriesSynchronizer.Synchronize(ifactor, intradayBars) >> 1;

// use this data for granular
TimeSeries intradayHighs = intradayBars.High * ifactor;
TimeSeries intradayLows = intradayBars.Low * ifactor;
That’s not too bad, let’s go ahead and try it out!
Let's not hard code 1Minute though, use whatever granular setting is established in the settings.
It was just to represent the intraday bars that were available.
(Added a correction above.)
Edited to remove the hard coded 1-minute scale.
Ready for test :)
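For reference, the session-close alignment in the C# snippet can be mirrored in a small Python sketch (synthetic data; plain dicts stand in for BarHistorySynchronizer/TimeSeriesSynchronizer, and the one-bar shift from the original code is omitted for brevity):

```python
# Synthetic data: EOD closes are adjusted, intraday bars are raw.
eod_close = {"2023-06-01": 49.9, "2023-06-02": 50.2}
intraday = [  # (day, high, low, close)
    ("2023-06-01", 100.5, 99.0, 99.8),
    ("2023-06-01", 100.2, 99.5, 100.0),
    ("2023-06-02", 101.0, 100.0, 100.4),
]

# "Compress" intraday to daily: the last close of each session wins.
intraday_daily_close = {}
for day, _high, _low, close in intraday:
    intraday_daily_close[day] = close

# Per-session factor, analogous to bars.Close / intradayDaily.Close.
factor = {d: eod_close[d] / intraday_daily_close[d] for d in eod_close}

# Scale each intraday bar's high and low by its session's factor.
adjusted = [(d, h * factor[d], l * factor[d]) for d, h, l, _c in intraday]
```

The scaled highs and lows are what granular processing would then search for the correct entry time.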