Hi,
The data shown below is mostly spaced at 5 seconds, and sometimes 10 seconds.
This is Y! data for Oct 14, 2024. There should be 12 x 390 = 4,680 data points (12 five-second bars per minute times 390 minutes in the regular session).
But there are only 3,535, so there are many 10-second gaps.
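As a sanity check on those counts, here is a minimal standalone sketch (plain C#, not WealthLab code; it assumes the regular 390-minute session and that every gap is exactly 10 seconds, so each gap accounts for exactly one missing 5-second bar):
CODE:
using System;

class GapCheck
{
    static void Main()
    {
        int sessionMinutes = 390;                        // regular 9:30-16:00 session
        int barsPerMinute = 12;                          // one bar every 5 seconds
        int expected = sessionMinutes * barsPerMinute;   // 4,680
        int actual = 3535;                               // points actually received

        // each 10-second gap swallows exactly one 5-second bar,
        // so the missing-bar count equals the number of gaps
        Console.WriteLine($"Expected {expected}, got {actual}: {expected - actual} bars missing.");
    }
}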
The calculated scale is 1 tick.
Question 1) How can I force it to be 5 seconds?
Question 2) What is meant by 1 tick?
When I set the Strategy Settings to 1 tick, it correctly finds this data set.
So I have a solution, but here is the problem with it:
I will be running a 1 min strategy along with a 5 sec strategy.
If both data sets are calculated to be 1 tick then how will strategies find their own 1 tick data sets?
Thanks!
I must add that this 1-tick result for both the 1-min and 5-sec data sets is not a common occurrence.
So far, all 1-minute data sets are correctly calculated to be 1-min scale.
We’d have to take a look at the data file; could you email it to support@wealth-lab.com?
Your data file included the following records with the same DateTime(s) as the record that preceded them, so we have no choice but to call it a Tick source.
^SPX	10/14/2024	11:07:11	5846.31	0
^SPX	10/14/2024	11:07:11	5846.31	0
^SPX	10/14/2024	11:07:11	5846.31	0
^SPX	10/14/2024	11:07:11	5846.31	0
^SPX	10/14/2024	11:07:11	5846.31	0
^SPX	10/14/2024	14:06:51	5856.25	0
^SPX	10/14/2024	14:06:56	5856.26	0
^SPX	10/14/2024	14:08:46	5859.72	0
^SPX	10/14/2024	14:08:51	5859.2	0
^SPX	10/14/2024	14:09:11	5858.29	0
^SPX	10/14/2024	14:11:21	5857.56	0
^SPX	10/14/2024	14:11:36	5857.55	0
^SPX	10/14/2024	14:14:06	5858.4	0
You can use this code (which gave the output above) to filter out records that are less than 4 seconds from the previous record. By default it looks for the data file in the WL Data folder and writes a new "filtered" file there.
CODE:
using System;
using System.Collections.Generic;
using System.IO;
using WealthLab.Backtest;
using WealthLab.Core;

namespace WealthScript4
{
    public class RemoveDuplicateDates : UserStrategyBase
    {
        public override void Initialize(BarHistory bars) { }

        public override void BacktestBegin()
        {
            string symbol = "^SPX";
            List<string> filtered = [];
            DateTime dte = DateTime.MinValue;
            DateTime lastDte = DateTime.MinValue;
            string[] lines = File.ReadAllLines(Path.Combine(WLHost.Instance.DataFolder, $"{symbol}.txt"));

            foreach (string line in lines)
            {
                if (line.Contains("Date") || line.Contains("Close"))   // skip a header line
                    continue;
                string[] tokens = line.Split('\t');
                if (tokens.Length < 3)
                    continue;
                string tokendte = $"{tokens[1]} {tokens[2]}";

                // if more than 4 seconds from the last date, add it to the filtered list; otherwise ignore this line
                if (DateTime.TryParse(tokendte, out dte))
                {
                    if ((dte - lastDte).TotalSeconds > 4)
                    {
                        filtered.Add(line);
                        lastDte = dte;
                    }
                    else
                    {
                        // records with a DateTime less than 4 seconds from the previous record
                        WriteToDebugLog($"{line}");
                    }
                }
            }

            // write the filtered list to a new file
            File.WriteAllLines(Path.Combine(WLHost.Instance.DataFolder, $"{symbol}-filtered.txt"), filtered);
        }

        public override void Execute(BarHistory bars, int idx) { }
    }
}
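One caveat, not from the original post: DateTime.TryParse uses the machine's regional settings, so M/d/yyyy dates can be misread or rejected on a non-US locale. A minimal sketch of a more locale-proof variant of the parsing step, using the standard .NET TryParseExact overload (the sample token here is hypothetical, standing in for tokens[1] and tokens[2]):
CODE:
using System;
using System.Globalization;

// parse the "date time" token with an explicit, culture-invariant format
// instead of relying on the machine's regional settings
string tokendte = "10/14/2024 11:07:11";   // sample token for illustration
if (DateTime.TryParseExact(tokendte, "M/d/yyyy HH:mm:ss",
        CultureInfo.InvariantCulture, DateTimeStyles.None, out DateTime dte))
{
    Console.WriteLine(dte);                 // parsed the same way on any locale
}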