# Subprime Information and Real Estate, from Gordon Haave

March 14, 2007

Why is it that whenever the government decides to protect us from market forces, we taxpayers get the shaft? Would it not be better simply not to interfere in the first place?

Financially, it would be better to let the foreclosures happen. Not foreclosing on bad debts means the economy never gets to reallocate capital to its best uses. That, in short, was the cause of Japan's decade-long recession.

If it is in the interest of the lenders to give people breathing room, they will do so without the government forcing them to.

## Rich Ghazarian adds:

Not long ago, I was involved in building predictive models for subprime products for one of the major shorts in the market today. There was no way to predict today's scenario, because a large part of the poor credit performance is due to fraudulent mortgages (loans originated on falsified information). Thus, most of the models are built on false historical data. For instance, a borrower recorded with a debt-to-income ratio (DTI) of 0.4 is all of a sudden a borrower with a DTI of 1.4 … oops! It is interesting that this fraud was mostly committed by "loan officers" and not the borrowers. Here is an example of quant models being useless!

# Non-Linear Relationships, from Philip McDonnell

March 13, 2007

I would like to offer some simple thoughts on non-linear relationships. The usual way to study non-linear correlations is to transform one or more of the variables in question. For example, if we have reason to believe that the underlying process is multiplicative, we can use a log function to model our data. When we do a correlation or regression of y ~ x, we can simply take the transformed variables ln(y) ~ ln(x) as our new data set. We are still doing a linear correlation or a linear regression, but now we are doing it on the transformed variables.
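As a minimal sketch of the log-log approach (hypothetical data; the exponent 1.5 and scale 2.0 are made up for illustration), an ordinary linear fit on the transformed variables recovers the parameters of a multiplicative process:

```python
import numpy as np

# Hypothetical multiplicative process: y = 2 * x^1.5 with multiplicative noise.
rng = np.random.default_rng(0)
x = np.linspace(1.0, 10.0, 200)
y = 2.0 * x**1.5 * np.exp(rng.normal(0.0, 0.05, x.size))

# Linear regression of ln(y) on ln(x): the slope recovers the exponent,
# and exp(intercept) recovers the scale factor.
slope, intercept = np.polyfit(np.log(x), np.log(y), 1)
print(slope, np.exp(intercept))  # close to 1.5 and 2.0
```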

Ideally we would know the form of the non-linear relationship from some theory. Absent that, we could use a general functional form such as a polynomial, so our transform could be something like X^2, X^3, or X^4. Using one of these terms on its own is usually pretty safe, but combining them in a multiple regression can be problematic. The reason is that the terms x^2 and x^3 are about 67% correlated. Using highly correlated variables to model or predict some third variable is a bad idea because you cannot trust the statistics you get.

One way around that is to use orthogonal polynomials or functions. We have previously discussed Fourier transforms and Chebyshev polynomials. Both of these classes are orthogonal, which also means that we can fit a few terms and add or delete terms at will. The fitted coefficients will not change if we truncate or add to the series. Each term is guaranteed to be linearly independent of the others.
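The truncation-stability property can be demonstrated with Legendre polynomials, which are orthogonal on [-1, 1] (a sketch, using numpy's `polynomial.legendre` module on a dense uniform grid, where the discrete fit closely approximates the orthogonal projection):

```python
import numpy as np
from numpy.polynomial import legendre

# Fit a smooth curve on [-1, 1] in the Legendre basis. Because the basis
# is orthogonal, adding higher-order terms leaves the low-order fitted
# coefficients (numerically) unchanged.
x = np.linspace(-1.0, 1.0, 2001)
y = np.exp(x)

c3 = legendre.legfit(x, y, 3)  # degree-3 fit
c5 = legendre.legfit(x, y, 5)  # degree-5 fit

# The first four coefficients barely move when two terms are added.
print(np.max(np.abs(c3 - c5[:4])))
```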

## Bruno Ombreux asks:

> Using one of these terms is usually pretty safe. But combining them in a multiple regression can be problematic. The reason is that the terms x^2 and x^3 are about 67% correlated. Using highly correlated variables to model or predict some third variable is a bad idea because you cannot trust the statistics you get.

I have a question.

One of the reasons for adding regressors is to take into account all possible reasons behind a move in the variable we are trying to explain. However, multicollinearity being prevalent in finance, it is a source of headaches.

If we could randomize and/or use designed experiments in empirical studies, as we do in biology, we could get rid of part of the problem.

Is it possible to randomize ex post? Let's say I want to study Y = aX + b + e. If, instead of taking the full history of observed (Y, X), I take a random sample of (Y, X), that creates some kind of post-randomization, which should reduce the impact of other factors.

Does it make sense? Of course, we would lose all the information contained in the non-sampled (Y, X) pairs. That means even less data to work with, which is not nice given ever-changing cycles.

Are there books about this type of technique? I have never heard of it, so maybe it doesn't exist.
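The mechanics of the proposal can be sketched as repeated random subsampling of the observed (Y, X) pairs and refitting on each subsample (hypothetical data; the coefficients a = 0.5, b = 1.0 are made up, and this only illustrates the procedure, not whether it actually neutralizes omitted factors):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical observed history: Y = a*X + b plus noise from other factors.
n = 1000
x = rng.normal(0.0, 1.0, n)
y = 0.5 * x + 1.0 + rng.normal(0.0, 0.5, n)

# Ex-post "randomization": fit the regression on repeated random
# subsamples of the (Y, X) pairs and look at the spread of the slope.
slopes = []
for _ in range(200):
    idx = rng.choice(n, size=200, replace=False)
    a_hat, b_hat = np.polyfit(x[idx], y[idx], 1)
    slopes.append(a_hat)

print(np.mean(slopes), np.std(slopes))  # centered near the true slope 0.5
```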

## Rich Ghazarian mentions:

And of course, if you want a more powerful model, you fit a copula to your processes, and now you have a more realistic dependence structure. Engle has a nice paper on Dynamic Conditional Correlation that may interest dependence modelers on the list. The reliance on Excel correlation, i.e., plain Pearson linear correlation, must be one of the biggest flaws in quant finance today.
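As a minimal sketch of the copula idea (hypothetical series; this is the standard rank-based Gaussian-copula correlation estimate, not any specific model from Engle's paper): map each series to uniform pseudo-observations via its ranks, map those to normal scores, and correlate the scores. Unlike Pearson correlation, the result is invariant to monotone transformations of each series.

```python
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(1)

# Two hypothetical series linked by a nonlinear but monotone relationship.
x = rng.normal(0.0, 1.0, 2000)
y = np.exp(x) + rng.normal(0.0, 0.2, 2000)

def normal_scores(v):
    # Ranks -> pseudo-observations in (0, 1) -> standard normal scores.
    u = rankdata(v) / (len(v) + 1.0)
    return norm.ppf(u)

rho_copula = np.corrcoef(normal_scores(x), normal_scores(y))[0, 1]
rho_linear = np.corrcoef(x, y)[0, 1]
print(rho_linear, rho_copula)  # the rank-based measure captures more of the dependence
```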

## Jeremy Smith adds:

With linear functions we can compute the eigenvectors to get an orthogonal representation. One problem that gets in the way of nonlinear models is that it isn't clear what the appropriate "distance" measure is. You need a formal metric of distance to model, compare, or optimize anything. How far apart are these points?

With linear axes, distance is determined by Pythagoras. But what is the suggested underlying measure of distance if the axes aren't linear?

These remarks about correlation resonate with me, especially in the case of the stock market.

## From Vincent Andres:

If you replace your original axes X and Y by new axes X' = fx(X) and Y' = fy(Y), this is a transformation of the kind P = (x, y) -> P' = f(P) = (x', y') = (fx(x), fy(y)).

This transformation can be inverted without worry: P' = (x', y') -> P = (x, y), where x and y are the antecedents of x' and y' through the inverse functions fx^-1 and fy^-1.

A natural distance measure in this new universe is thus: dist(P1, P2) = dist(ant(P1), ant(P2)), where ant = antecedent.

This works for any strictly monotonic functions fx and fy (e.g., ln(x), or x^2 restricted to x > 0), because there is then a bijection between the two universes. It could even be extended to a larger class of functions.

Sorry for the cumbersome notation, but I hope the idea is clear.
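The idea above can be sketched in a few lines (the choice fx = ln and fy = y^2 on y > 0 is just an illustrative assumption): measure points in the transformed universe by pulling them back through the inverse maps and applying ordinary Euclidean distance to the antecedents.

```python
import numpy as np

def dist(p1, p2, fx_inv, fy_inv):
    # Pull each transformed point back to its antecedent, then use Pythagoras.
    a1 = np.array([fx_inv(p1[0]), fy_inv(p1[1])])
    a2 = np.array([fx_inv(p2[0]), fy_inv(p2[1])])
    return np.linalg.norm(a1 - a2)

fx_inv = np.exp   # inverse of fx = ln
fy_inv = np.sqrt  # inverse of fy = y^2 on y > 0

p1 = (np.log(1.0), 2.0**2)  # image of the point (1, 2)
p2 = (np.log(4.0), 6.0**2)  # image of the point (4, 6)
print(dist(p1, p2, fx_inv, fy_inv))  # 5.0, the plain distance between (1, 2) and (4, 6)
```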

