Someone asked how to use Math.NET to fit a regression problem. I'm offering this Visual Studio example, which fits the equation y = betaA*x^(-1) + betaB*x^(0) + betaC*x^(1) and returns the slope over the latest values of the time series passed into it. If someone wants to develop a ScoreCard metric using this code to return the final slope of the equity curve, be my guest; I would be interested in getting a copy of your source code solution. Please employ ScottPlot to visualize your equity curve fit. You might also consider optionally adding an x^(-2) term (or I can do that).
CODE:
using WealthLab.Core;                    // TimeSeries class
using MathNet.Numerics;                  // GoodnessOfFit class
using MathNet.Numerics.LinearAlgebra;    // Matrix & Vector types
using MathNet.Numerics.LinearRegression; // MultipleRegression class

namespace Local.Components
{
   public static class Regress // latest-bars regression fitting for Wealth-Lab
   {
      static double slopeFinalInternal;
      static double rSquaredInternal;

      public static double SlopeFinal { get { return slopeFinalInternal; } }
      public static double RSquared { get { return rSquaredInternal; } }

      // Three-term polynomial fit with reciprocal term
      public static TimeSeries ReciprocalFit(TimeSeries yObservedTS, int lastBar = 0, int sampleSize = 10)
      {
         return ReciprocalFit(yObservedTS, out slopeFinalInternal, out rSquaredInternal, lastBar, sampleSize);
      }

      public static TimeSeries ReciprocalFit(TimeSeries yObservedTS, out double slopeFinal, out double rSquared, int lastBar = 0, int sampleSize = 10)
      {
         if (sampleSize > yObservedTS.Count - 1)
            sampleSize = yObservedTS.Count - 1; // prevent out-of-bounds access

         // regression ends at the end of the TimeSeries (lastBar == 0); otherwise, it ends at lastBar
         int barInit = (lastBar == 0) ? yObservedTS.Count - sampleSize : lastBar + 1 - sampleSize;

         Matrix<double> X = Matrix<double>.Build.Dense(sampleSize, 3);
         Vector<double> yObserved = Vector<double>.Build.Dense(sampleSize);
         double[] yModel = new double[sampleSize];

         double barDbl = barInit; // initialize x input for the loop below
         for (int yBar = 0, bar = barInit; yBar < sampleSize; yBar++, bar++, barDbl++)
         {
            X.SetRow(yBar, new double[] { 1.0 / barDbl, barDbl, 1.0 }); // optionally add barDbl*barDbl for an x^2 term
            yObserved[yBar] = yObservedTS[bar];
         }

         Vector<double> p = MultipleRegression.NormalEquations(X, yObserved);

         TimeSeries yModelTS = new TimeSeries(yObservedTS.DateTimes);
         yModelTS.Description = "Polynomial fit w/reciprocal term";
         yModelTS.FirstValidIndex = barInit;
         for (int yBar = 0, bar = barInit; yBar < sampleSize; yBar++, bar++)
            yModel[yBar] = yModelTS[bar] = p[0] * X[yBar, 0] + p[1] * X[yBar, 1] + p[2];

         rSquared = GoodnessOfFit.RSquared(yModel, yObserved);
         slopeFinal = yModel[yModel.Length - 1] - yModel[yModel.Length - 2];
         return yModelTS;
      }
   }
}
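For anyone who wants to check the math outside of .NET, here is a minimal Python/numpy sketch of the same three-term fit (design-matrix columns 1/x, x, 1) solved via the normal equations, the same approach MultipleRegression.NormalEquations takes. The synthetic series and variable names here are my own inventions for illustration, not part of the C# example.

```python
import numpy as np

# Hypothetical sample: the last 10 values of a series indexed by bar number,
# generated from a model with a genuine reciprocal component.
bars = np.arange(90, 100, dtype=float)
y = 100.0 + 0.5 * bars + 20.0 / bars

# Design matrix with columns 1/x, x, 1 (same terms as the C# example above).
X = np.column_stack([1.0 / bars, bars, np.ones_like(bars)])

# Solve the normal equations (X'X) p = X'y for the coefficients.
p = np.linalg.solve(X.T @ X, X.T @ y)

# Fitted values, R-squared, and the "final slope" as the difference
# of the last two fitted values -- mirroring the C# code.
y_model = X @ p
ss_res = np.sum((y - y_model) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
slope_final = y_model[-1] - y_model[-2]
```

Because the synthetic data lie exactly in the model's span, the recovered coefficients come back near (20, 0.5, 100) and R-squared is essentially 1. Note that forming X'X squares the condition number, so for real data a QR-based solver would be numerically safer than the normal equations.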
Happy computing to you.
Glitch8
#1
I just had an idea. We recently talked about adding some new Performance Metric(s) to give us the slope of the equity curve. For now, I'm going to add a simple linear regression of the equity curve, but I see there is room for more advanced metrics around this, including the polynomial calculations you described.

We just updated the ScoreCard API document:

https://www.wealth-lab.com/Support/ExtensionApi/ScoreCard

Would you be willing to create a ScoreCard containing one or more advanced metrics based on your code? We can then publish this as part of our WL8 Extension Demo project. It's always good to have more source code examples of WL8 components!

https://github.com/LucidDion/WL8ExtensionDemos
#2
QUOTE:
adding some new Performance Metric(s) to give us the slope of the equity curve. ... add a simple linear regression of the equity curve.

It's already possible to do a simple linear regression of the equity curve with Dr Koch's finantic.ScoreCard extension. Below is a direct quote from his email.
QUOTE:
The "LinearRegression" function available in the expressions of FormulaScoreCard is a special library function within the finantic.ScoreCard Library. It is crafted to be useful in the restricted environment of a metrics formula. Therefore it returns a Tuple<> of 5 doubles:

CODE:
// y = slope*x + intercept
// public static Tuple<double, double, double, double, double> LinearRegression(IEnumerable<double> yvalues)
// returns Tuple<intercept, slope, r, stderr, stderr(slope)>

// You can see an example usage in Lars' K-Ratio metric:
var r = LinearRegression(EquityCurve);
double slope = r.Item2;
double se_slope = r.Item5;
double rawK = slope / se_slope;
double K = rawK * Sqrt(252) / EquityCurve.Count;
return K;
I haven't been able to find the K-Ratio example in the finantic.ScoreCard distribution, but he's given us the K-Ratio solution above, so we are good.
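For anyone wanting to verify the quoted formula outside of WL, here is a hedged Python sketch of the same calculation: ordinary least squares on a synthetic equity curve, the standard error of the slope from the residuals, then the K-Ratio scaling from the quote. The equity curve and its parameters are invented for illustration.

```python
import numpy as np

# Hypothetical equity curve: one year of steady growth plus seeded noise.
rng = np.random.default_rng(0)
n = 252
x = np.arange(n, dtype=float)
equity = 100.0 + 0.3 * x + rng.normal(0.0, 1.0, n)

# Ordinary least squares: equity ~ slope*x + intercept
x_mean, y_mean = x.mean(), equity.mean()
sxx = np.sum((x - x_mean) ** 2)
slope = np.sum((x - x_mean) * (equity - y_mean)) / sxx
intercept = y_mean - slope * x_mean

# Standard error of the slope from the residuals (n - 2 degrees of freedom).
residuals = equity - (slope * x + intercept)
se_slope = np.sqrt(np.sum(residuals ** 2) / (n - 2) / sxx)

# K-Ratio per the quoted formula: slope in units of its own uncertainty,
# annualized and normalized by the sample length.
k_ratio = (slope / se_slope) * np.sqrt(252) / n
```

The quantity slope/se_slope is just a t-statistic for the slope, which is why the K-Ratio rewards equity curves that are both steep and steady.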

QUOTE:
We just updated the ScoreCard API document.
Thanks for doing that.

QUOTE:
Would you be willing to create a ScoreCard containing one or more advanced metrics based on your code?
That's been on my list of things to do for 10 months now. But is it going to require knowledge of WPF? Learning WPF has also been on my list, but there are three very long chapters on WPF in my C# textbook that I have yet to read. And after reading them, I'd want to start with a simple problem, like installing the ScottPlot WPF control into the WL extension demo, to get rolling with WPF.

If I can define a ScoreCard metric without knowing WPF, I can get started with it sooner; otherwise, we are looking at 6 months out.

---
I have thought some about the equity curve fitting problem (for a ScoreCard metric). To fit the equity curve rigorously, you should probably be fitting a mixed model (both regression fit and difference equation fit). However, although the difference equation fit (i.e. ARMA model fitting of a time series) would be good at capturing the particulars of the equity curve behavior, that's not really our goal here. Our immediate goal with a ScoreCard merit metric is a "generalized fit" to capture the overall equity curve behavior, and a regression fit alone may be good enough for that; at least for version 1.0 of this attempt.

And the real goal is to get the literature talking (say about the shortcomings of our limited approach). And I think a regression fit will be good enough to do that.
Glitch8
#3
No, in fact the WealthLab.Core library is platform neutral and doesn't even reference WPF. It's just plain old .NET.
#4
QUOTE:
[ScoreCard code] doesn’t ... reference WPF. It’s just plain old .NET.

I may look at it sooner then. But there's still the issue of seeing how the polynomial regression fit looks with ScottPlot to evaluate how well it's working; the R-squared doesn't tell you everything about the fit. But I can have ScottPlot render the fit to an image file for debugging.

By the way, WL7 has a K-Ratio ScoreCard metric listed in its Extended ScoreCard offering, so you do have "limited" simple regression fitting available now. But I remember checking this out on WL6 two years ago, and I didn't find the K-Ratio that helpful.

Today, I "manually eyeball" the equality curve on marginally performing stocks and remove them if it's erratic or negatively sloping. If I had a meaningful ScoreCard metric, I wouldn't have to be doing that.
#5
As I think more deeply about this problem, there are some issues (loose ends). For one, the polynomial fit returns two numbers: the latest slope and an R-squared. Both are important and have very different meanings (they are orthogonal). Yes, it would be nice to combine them somehow into a single merit metric, but some kind of visualization for doing this may be useful. The classical approach would be to take the 2-norm (Euclidean norm) of these two numbers, but that's not the only option.

You're actually familiar with the Euclidean norm without knowing it. If you've ever used an AC voltmeter, the voltage it reports is the root-mean-square (RMS) of the instantaneous voltage: square the voltage, average the squares over time, then take the square root. Up to the averaging factor, that's exactly the Euclidean norm of the sampled voltages. (Average power is computed in the same spirit: it's the time average of the instantaneous product of current and voltage, which for a purely resistive load reduces to RMS amps times RMS volts.)

I would think about how to add a ScoreCard visualization that would combine the slope and R-squared somehow. The Euclidean norm would be one combining option. The user might want to select other combining options based on the visualization as well.
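As a concrete illustration of the combining idea, here is a tiny Python sketch taking the Euclidean norm of a hypothetical (slope, R-squared) pair. The numbers are made up, and in practice the slope would need rescaling so that its magnitude is commensurate with R-squared before the norm means anything.

```python
import math

# Hypothetical metric pair from a polynomial fit of the equity curve.
slope_final = 0.45   # latest fitted slope, assumed already rescaled to ~[0, 1]
r_squared = 0.88     # goodness of fit, in [0, 1]

# One combining option: the Euclidean (2-)norm of the two components.
merit = math.hypot(slope_final, r_squared)
```

A weighted norm, sqrt(w1*slope^2 + w2*r2^2), would be the obvious generalization once a visualization suggests how much each component should count.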

UPDATE: Another approach would be to let ScoreCard metrics return a Tuple<> instead of a scalar. It would then be the job of extensions like finantic.ScoreCard to combine these tuples into a single merit metric. But even finantic.ScoreCard falls short of employing a visualization to help build these merit metrics, which is really a numerical research problem best addressed in a full-blown statistical analysis package. finantic.ScoreCard is simply a stopgap measure for a complex research problem.
#6
I've decided to shelve this project until I can find a partner to help me with the front-end WPF side of it. What I need from the partner is two child windows: a data grid window associated with the WL extension we build, and a second, floating child window that supports ScottPlot. Also, when a row in the data grid is clicked, it needs to trigger a callback routine I will use to ScottPlot that row's statistics.

I will handle all of the numerical analysis and statistics, which I'm proficient at. But the partner needs to handle the front-end WPF side, which I'm unable to do.

There's no need for creating ScoreCard objects for this job. We're not doing that.

For my own purposes, I'm going to write all the relevant equity-curve merit metrics to a disk file, which can be imported by R. That should meet my own statistical and plotting needs.
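The disk-file idea only takes a few lines. As a sketch of the interchange format (column names and values are invented), here is a Python example writing per-symbol metrics to a CSV that R can load with read.csv():

```python
import csv

# Hypothetical per-symbol metrics destined for analysis and plotting in R.
rows = [
    {"symbol": "AAA", "slope_final": 0.45, "r_squared": 0.88},
    {"symbol": "BBB", "slope_final": -0.1, "r_squared": 0.35},
]

with open("equity_metrics.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["symbol", "slope_final", "r_squared"])
    writer.writeheader()   # first line becomes the R data-frame column names
    writer.writerows(rows)
```

On the R side, read.csv("equity_metrics.csv") then yields a data frame with symbol, slope_final, and r_squared columns, ready for plotting.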