Hello. My knowledge of statistics is limited, but could someone tell me, in layman's terms, the difference between exhaustive parallel & exhaustive non-parallel optimization? Or is there somewhere you can point me to that would explain what ALL the different optimization methods mean? I see there are about 10 or so different ways to optimize. TY.

Exhaustive simply means the optimizer tries every possible permutation of all the parameters. If you plan to make a 3D plot of the optimizer's performance, then having a grid to plot from is a good option, but I would only optimize two parameters at a time when doing this; otherwise, you will wait forever.
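Since the thread has no code, here is a minimal sketch in Python (not WealthLab code) of what "exhaustive" means: every combination of two parameter ranges is tried. The `backtest()` function is a hypothetical stand-in for a real strategy backtest that returns a score.

```python
# A minimal sketch of exhaustive (grid) optimization over two parameters.
# backtest() is a hypothetical placeholder for a real strategy backtest.
from itertools import product

def backtest(fast, slow):
    # Placeholder objective: peaks at fast=10, slow=50.
    return -((fast - 10) ** 2 + (slow - 50) ** 2)

fast_values = range(5, 16)        # 11 values
slow_values = range(40, 61, 2)    # 11 values

# Exhaustive = try every combination of the parameter grid (121 runs here).
results = {(f, s): backtest(f, s) for f, s in product(fast_values, slow_values)}
best = max(results, key=results.get)
print(best)  # (10, 50) — the grid point with the highest score
```

Note how quickly the run count grows: two parameters with 11 values each already means 121 backtests, which is why optimizing only two parameters at a time is advisable for grid plots.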

Many optimizers are multithreaded, so they may optimize several symbols on several different cores of your processor chip simultaneously to speed things up. Ideally, whether it's doing this (parallel optimization) or not (serial or non-parallel optimization), the answer should be the same. If it's not, then your optimization isn't thread safe and you need to change your declarations to better isolate thread variables from each other. This "variable scoping" is an advanced topic discussed in https://www.wealth-lab.com/blog/anatomy-of-a-wl7-strategy Let's start a new discussion specific to variable scoping if you want to dig into thread safety.
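To illustrate the thread-safety point above, here is a small Python sketch (not WealthLab code): the worker function `score()` keeps all of its state in local variables, so running it serially or on a thread pool gives identical answers. If it mutated a shared variable instead, the parallel result could differ.

```python
# A minimal sketch of why parallel and serial optimization should agree:
# each worker keeps its own local state instead of sharing variables
# across threads.
from concurrent.futures import ThreadPoolExecutor

def score(params):
    # Hypothetical per-combination backtest; uses only local variables,
    # so it is safe to run on many threads at once.
    total = 0
    for p in params:
        total += p * p
    return total

combos = [(1, 2), (3, 4), (5, 6)]

serial = [score(c) for c in combos]
with ThreadPoolExecutor() as pool:
    parallel = list(pool.map(score, combos))

assert serial == parallel  # thread safe: both orderings give the same answer
```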

The only difference between Exhaustive and Exhaustive Non-Parallel is that the former runs on multiple threads, so it can run more efficiently on computers with multiple CPU cores. In Build 19 (coming tomorrow) it has even been streamlined to run faster and consume less memory.

If you can get a hold of the November 2022 issue of Stocks and Commodities magazine, DrKoch has an article there talking about this exact topic and featuring WealthLab!

QUOTE:

there's about 10 or so different ways to optimize

Optimization is a complex topic, and there is considerable research on optimization algorithms. The goal is always to

1.) find a good parameter combination with as few backtest runs as possible

2.) find a parameter combination that works reasonably well Out-of-Sample

The literature offers many approaches to this task.

WL comes with 3 native algorithms (Random, Grid, and Shrinking Window).
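As a rough Python sketch (not the WL implementation) of how a Random optimizer differs from Grid: instead of walking the full grid, it samples parameter combinations uniformly within their bounds, trading completeness for far fewer backtest runs. `backtest()` is again a hypothetical placeholder.

```python
# A minimal sketch of a Random optimizer: sample parameter combinations
# uniformly instead of sweeping the full grid. backtest() is hypothetical.
import random

def backtest(fast, slow):
    # Placeholder objective: peaks at fast=10, slow=50.
    return -((fast - 10) ** 2 + (slow - 50) ** 2)

random.seed(42)  # fixed seed so the sketch is reproducible
best_params, best_score = None, float("-inf")
for _ in range(30):  # 30 runs instead of an exhaustive 11 x 21 sweep
    fast = random.randint(5, 15)
    slow = random.randint(40, 60)
    s = backtest(fast, slow)
    if s > best_score:
        best_params, best_score = (fast, slow), s
print(best_params, best_score)
```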

The finantic.Optimizer Extension (https://www.wealth-lab.com/extension/detail/finantic.Optimizer) adds 4 new algorithms and 2 variations of Random and Grid.

It turned out that the SMAC optimizer in particular does a very good job of finding a good parameter combination with just a few backtest runs (100-200), especially if there are more than 2 parameters.
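SMAC itself builds a surrogate model of the objective, which is beyond a forum post, but the core idea can be caricatured in a few lines of Python: after some purely random runs, spend most of the remaining budget near the best point found so far, while keeping a little random exploration. This is only an illustration of the idea, not SMAC.

```python
# A rough illustration (NOT SMAC itself) of the idea behind model-based
# optimizers: exploit the neighborhood of the current best, explore a little.
import random

def backtest(fast, slow):  # hypothetical objective
    return -((fast - 10) ** 2 + (slow - 50) ** 2)

random.seed(0)
tried = {}

def sample_random():
    return random.randint(5, 15), random.randint(40, 60)

# Phase 1: a few purely random runs.
for _ in range(10):
    p = sample_random()
    tried[p] = backtest(*p)

# Phase 2: mostly exploit around the current best, with 20% exploration.
for _ in range(20):
    if random.random() < 0.2:
        p = sample_random()
    else:
        bf, bs = max(tried, key=tried.get)
        p = (min(15, max(5, bf + random.randint(-2, 2))),
             min(60, max(40, bs + random.randint(-2, 2))))
    tried[p] = backtest(*p)

best = max(tried, key=tried.get)
print(best, tried[best])
```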

As Glitch mentioned above, there is a series of articles in TASC magazine (http://traders.com/) about the optimizers of WL and various tricks for performing good optimizations.

**Bottom Line**

The various algorithms differ in the amount of time they need to find good parameter combinations.

They also differ in quality of the combinations found.

Or to put it boldly: use the best algorithm you can get to improve your strategy; otherwise you'll waste a lot of time.

See also discussion thread "Choosing the best Strategy optimizer": https://www.wealth-lab.com/Discussion/Choosing-the-best-Strategy-optimizer-6707

The problem is that we have to wait 3 months to read the whole article...

Is there any way to share it now?

The first part is a whole article and is available now. There will be two more whole articles, parts 2 and 3, but they aren't ready yet, and no, DrKoch won't be able to share them here because they're the property of TASC magazine.

The above link in Reply #4 describes the __most salient__ information relevant to the question in this thread about comparing optimization algorithms. That's what you want most. Articles in TASC may add some details, but the above link has the meat.

Besides all the valid points and different ways to use optimization, you should also consider WFO (Walk-Forward Optimization) to get a more realistic picture of what you can expect from future results. I also always suggest the expanding optimization window.
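To make the expanding-window idea concrete, here is a small Python sketch (not WealthLab's WFO implementation): each fold optimizes on all bars seen so far and then tests on the next unseen slice, so the in-sample window grows over time.

```python
# A minimal sketch of Walk-Forward Optimization with an expanding window:
# optimize on all data seen so far, then test on the next out-of-sample slice.
def expanding_walk_forward(n_bars, test_size):
    folds = []
    start_test = test_size  # need at least one in-sample slice first
    while start_test + test_size <= n_bars:
        in_sample = (0, start_test)                        # grows each fold
        out_of_sample = (start_test, start_test + test_size)
        folds.append((in_sample, out_of_sample))
        start_test += test_size
    return folds

for ins, oos in expanding_walk_forward(n_bars=1000, test_size=250):
    print("optimize on bars", ins, "-> test on bars", oos)
```

With 1000 bars and a 250-bar test slice this produces three folds, each optimizing on a longer history than the last.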

I am looking for this part
