Statistical analysis of a single time series, or a correlation between two variables, is easy to compute with limited data, standard formulas, and normality assumptions. In real life, however, there are obviously more than two variables, and in practice everyone considers several at once. I've read about and used Kendall's rank correlation, but am not sure exactly how it works; it seems like a way to weigh multiple factors or variables. Many things affect the price of a futures contract, and real-time data is available for a number of things that should affect price, not just prior prices. Isn't there a way to factor more than one of these in on a quantitative basis? Do them one at a time but simultaneously, and average the result?
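One concrete way to "do them one at a time" is to rank each candidate factor by its Kendall's tau against price changes. A minimal sketch, using `scipy.stats.kendalltau`; the factor names and numbers below are invented for illustration, not real market data:

```python
# Rank hypothetical factors by Kendall's tau against price changes.
from scipy.stats import kendalltau

price_changes = [0.5, -0.2, 0.1, 0.8, -0.4, 0.3]
factors = {
    "volume":        [100, 80, 95, 120, 70, 90],   # made-up data
    "open_interest": [50, 55, 48, 60, 52, 49],     # made-up data
}

results = {}
for name, series in factors.items():
    tau, p_value = kendalltau(series, price_changes)
    results[name] = tau
    print(f"{name}: tau={tau:.2f}, p={p_value:.2f}")
```

Each factor gets a score in [-1, 1] measuring how consistently its ranking agrees with the ranking of price changes, which gives a rough, distribution-free way to compare factors before combining them.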

One game-theory way to make decisions is to keep two columns, pluses and minuses. A factor can be added to either column, weighted by its importance to the participants or by its probability. Simply weighing the two sides against each other can help with a decision, and it can be quantified.
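The two-column tally is easy to quantify directly. A tiny sketch; the factor names and weights here are hypothetical:

```python
# Hypothetical weighted pluses-and-minuses tally for a trading decision.
pluses = {"strong demand": 3, "seasonal tailwind": 2}
minuses = {"rising inventories": 2, "weak chart pattern": 1}

# Net score: positive favors the plus column, negative the minus column.
score = sum(pluses.values()) - sum(minuses.values())
print("net score:", score)
```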

Steve Ellison writes: 

I sometimes use a machine learning technique called gradient descent to get an idea of which of many factors is more predictive. The (greatly oversimplified) general idea is to start with a default weighting parameter on each input, then make a prediction and compute a "cost function" (error) on the difference between the actual result and the prediction. As it sequentially processes each occurrence, the algorithm adjusts each weighting parameter up or down to try to reduce the cost function. The factors that end up with weights farther from zero (in either direction) appear to be more predictive. This technique helps me narrow down the possibilities, and I can then pick a few factors to evaluate using simple linear regression.
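The idea above can be sketched in a few lines of NumPy. This is a batch version rather than the one-occurrence-at-a-time update described, and the synthetic data (where only the first factor actually matters) is invented to show how the predictive factor's weight ends up farthest from zero:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 3))                    # three candidate factors
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=n)   # only factor 0 is predictive

w = np.zeros(3)   # default weighting parameters
lr = 0.1
for _ in range(500):
    pred = X @ w
    error = pred - y          # cost function: mean squared error
    grad = X.T @ error / n
    w -= lr * grad            # nudge each weight to reduce the cost

print(np.round(w, 2))         # factor 0's weight dominates; others stay near 0
```

After training, the weights farthest from zero flag the factors worth a closer look with simple regression, as described.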

Mr. Isomorphisms writes: 

Linear algebra is the study of multiple variables interacting at once. However, it's a difficult subject that requires multiple goes to really understand intuitively.

http://linear.axler.net is something I'm looking at these days. MIT OCW has a good first course.

Gram Zeppi made some interesting comments on Quora: the Jordan decomposition is the main result (or maybe it's factorisations). Based on pure formalism and symbol manipulation, one can rearrange the interacting factors into smaller independent matrices that compose as a sequence of simpler transforms rather than one big transform, while achieving exactly the same result.
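A small illustration of that "sequence of simpler transforms" idea, using the eigendecomposition of a symmetric matrix (a simpler special case than the full Jordan decomposition) via `numpy.linalg.eigh`:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])      # one "big" symmetric transform

# Factor A into simpler pieces: rotate, stretch along axes, rotate back.
vals, vecs = np.linalg.eigh(A)
D = np.diag(vals)               # pure stretch (diagonal matrix)

# Composing the simpler pieces reproduces the original transform exactly.
reconstructed = vecs @ D @ vecs.T
print(np.allclose(reconstructed, A))
```

The single coupled transform A becomes a rotation, an independent stretch along each axis, and the inverse rotation, achieving the same result through simpler steps.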

I still don't really understand how linear algebra is used in time series. One of the standard transformations is to swap dimensions, but that should be strictly disallowed in a time series, where order matters. (Differencing first and then doing linear algebra would be a little better in this regard. Then, as with most choices of window, one implicitly assumes that the "pieces" all come from the same statistical population.)







