It looks like the Data Manager has problems showing detailed information when a large amount of data is loaded. I am talking about 150 million lines of data for about 100 symbols.
Of course, I am not able to say that the size is the reason; it might be a problem with the data itself. At this point I ask the WL team to verify whether this amount of data can be handled by the Data Manager's view.
WL does not impose any hard-coded limitations, so any limits you encounter will be practical ones. I personally don't have data containing 150 million lines for 100 symbols (at least not at the moment), so I can't comment on this setup, but my feeling is that you're reaching a practical limit with this amount of data.
Hi Glitch.
I think producing pseudo data with 150 million lines shouldn't be a problem.
The OHLC data can be constant (or randomized to some degree); I am not sure about the date.
But writing a loop with a date increment and constant OHLC data should work to produce data for the debug scenario.
This way, you could check on your side how the loading operation behaves.
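To illustrate the idea, here is a minimal sketch of such a generator: it writes constant OHLC bars with an incrementing timestamp to a CSV file. The file name, column layout, and date format are assumptions for illustration; adjust them to whatever import format the Data Manager expects.

```python
import csv
from datetime import datetime, timedelta

def write_pseudo_ohlc(path, start, bars, interval_minutes=1):
    """Write `bars` rows of constant OHLC data, one bar per interval."""
    t = start
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["Date", "Open", "High", "Low", "Close", "Volume"])
        for _ in range(bars):
            # Constant prices; only the timestamp advances each row.
            w.writerow([t.strftime("%Y-%m-%d %H:%M"), 100.0, 101.0, 99.0, 100.5, 1000])
            t += timedelta(minutes=interval_minutes)

# Small sample; scale `bars` up to ~1,500,000 per symbol
# (x 100 symbols) to reach the 150-million-line scenario.
write_pseudo_ohlc("SYM001.csv", datetime(2000, 1, 1, 9, 30), bars=10)
```

Running this per symbol (with different file names) would give a reproducible data set for the loading test without needing any real market data.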
OK I'll generate some data for testing then, thanks.