# Extreme Values, from Victor Niederhoffer

April 24, 2013

I found the paper Stochastic Hydrology, Lecture 24, to be the best discussion of extreme values useful in finance that I've found. It uses the time between successive maximum events as a base for calculating the probability that a maximum will be exceeded, using simple, relatively understandable quantities that a person can calculate and count with pencil and paper. It ends a rather time-consuming search for such a method.
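The counting approach the lecture describes can be sketched with a plotting-position estimate: rank the observed maxima and take the return period of the value of rank m as T = (n + 1)/m. The figures below are invented for illustration; this is a sketch of the counting idea, not the lecture's exact worked example.

```python
# Pencil-and-paper estimate of how often a maximum is exceeded:
# rank the observed annual maxima in descending order and use the
# Weibull plotting position T = (n + 1) / m, where m is the rank.
# The series below is invented, not real market or hydrologic data.

annual_maxima = [112, 98, 141, 105, 133, 120, 97, 156, 110, 125]

n = len(annual_maxima)
ranked = sorted(annual_maxima, reverse=True)

for m, value in enumerate(ranked, start=1):
    return_period = (n + 1) / m      # average years between exceedances
    p_exceed = 1.0 / return_period   # chance of exceedance in any one year
    print(f"value={value:4d}  rank={m:2d}  T={return_period:5.2f}y  P={p_exceed:.3f}")
```

The record value (rank 1) gets T = n + 1 years, i.e. a roughly 1/(n + 1) chance of being exceeded in any given year, which is the kind of count-based figure the lecture arrives at.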

# Comments

1 Comment so far


How Stochastic Hydraulic States Fail to Indicate The Quantitative Relativity of Price Action

An implied methodology that uses stochastic hydraulic state analysis to determine probabilities of price action is problematic because of translational barriers: rules-based assumptions drawn from the mean and standard deviation (the output) of open-system processing do not carry over into closed-loop systematics.

“Stochastic” comes from the Greek word στόχος, which means “aim”. It also denotes a target stick; the pattern of arrows around a target stick stuck in a hillside is representative of what is stochastic.

The issue here then revolves around that time denomination, present to future. From a risk-management standpoint, such unknown events (for example, predicting price action) are synonymous with a stochastic system that generates a state that is non-deterministic, because a subsequent state is determined both by the system’s predictable actions and by a random element.

It is interesting that Victor’s cited lecture is preceded by one on Markov chains (http://nptel.iitm.ac.in/courses/105108079/module6/lecture24.pdf), to wit:

Given that a DTMC “is a mathematical system that undergoes transitions from one state to another, between a finite or countable number of possible states,” note also that it is a “random process usually characterized as memoryless: the next state depends only on the current state and not on the sequence of events that preceded it… [a] specific kind of memorylessness… [that has] many applications as statistical models of real-world processes.”
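As a rough illustration of the memoryless property described above, here is a minimal two-state chain; the state names and transition probabilities are invented for the example, not drawn from the lecture.

```python
import random

# Minimal discrete-time Markov chain: two invented states with made-up
# transition probabilities.  The next state is sampled using only the
# current state -- the memoryless property of a DTMC.
TRANSITIONS = {
    "calm":     {"calm": 0.9, "volatile": 0.1},
    "volatile": {"calm": 0.4, "volatile": 0.6},
}

def step(state, rng):
    """Sample the next state from the current state's transition row."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point round-off

rng = random.Random(0)
state = "calm"
path = [state]
for _ in range(10):
    state = step(state, rng)
    path.append(state)
print(path)
```

Each draw consults only the current row of the transition table; nothing about the earlier path enters the computation.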

The point here is that the random process as articulated requires an open system. My research, however, indicates that closed systems are required to efficiently measure input-output-state indicators that are otherwise infrequent or random. This phenomenon is best witnessed during a given system’s state change.

When it comes to understanding the significance of closed loop processing to minimize risk of stochastic modeling, my favorite is Lecture-8 State Machine Design, offered in an education series funded by the Government of India. See… http://www.youtube.com/watch?v=hg2QxXeI_-8

Here the lecturer notes that a state is “a necessary evil” to achieve an outcome; “it is a means to achieve.” With any corresponding excitation table, the question then becomes what the next state is from a present state. System design focuses on the inputs to be processed and the outputs so generated; states concern what happens in between (the state changes and state transitions) in that input-output-state processing.

A synchronous system differs from an asynchronous one (which lacks a clock) in that inputs may or may not change the output on a clock edge. See the J-K flip-flop example in the cited Lecture 8 YouTube video.
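For readers without the video to hand, the standard J-K flip-flop next-state rule can be sketched as follows; this is the textbook truth table, not code from the lecture.

```python
def jk_next_state(q, j, k):
    """Next state Q' of a J-K flip-flop on a clock edge.

    J=0,K=0: hold    J=1,K=0: set
    J=0,K=1: reset   J=1,K=1: toggle
    Equivalently Q' = (J AND NOT Q) OR (NOT K AND Q).
    """
    return (j and not q) or (not k and q)

# Enumerate the full excitation behavior.
for q in (0, 1):
    for j in (0, 1):
        for k in (0, 1):
            print(f"Q={q} J={j} K={k} -> Q'={int(jk_next_state(q, j, k))}")
```

The J=K=1 toggle case is what distinguishes the J-K from a simple S-R latch, and it is the clock that decides *when* these inputs are allowed to excite a state change.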

Note how synchronous systems are useful in “noisy situations” where input would otherwise indicate excitation. Might market volatility and price-data distinctions be so correlated?

Because hydrologic systems are influenced by extreme events (severe storms, floods, droughts), frequency analysis is relevant. Frequency factors may or may not be comparable, given that price action reportedly channels for 80% of a given market session, generating such (coordinate) extreme events infrequently – though not necessarily randomly.

As Victor’s cited article discusses, calculating the magnitudes of extreme events requires the cumulative probability distribution function to be invertible. Since some probability distribution functions are not readily invertible, alternative methods of calculating the magnitudes of extreme events use frequency factors. For instance, consider the annual maximum discharge of a river over 45 years: how does one calculate a frequency factor and obtain the annual maximum discharge corresponding to a 20-year return period?
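One common route to that answer is the Gumbel (Extreme Value Type I) frequency factor K_T, with the estimate x_T = mean + K_T · s. The sketch below uses invented sample statistics as stand-ins; they are not the lecture's actual 45-year record.

```python
import math

def gumbel_frequency_factor(T):
    """Gumbel (EV Type I) frequency factor K_T for return period T years."""
    return -(math.sqrt(6) / math.pi) * (0.5772 + math.log(math.log(T / (T - 1))))

# Invented sample statistics standing in for a 45-year record of
# annual maximum discharge.
mean_q = 2000.0   # sample mean of annual maxima (m^3/s) -- placeholder
std_q = 450.0     # sample standard deviation (m^3/s) -- placeholder

T = 20  # 20-year return period
K = gumbel_frequency_factor(T)
x_T = mean_q + K * std_q
print(f"K_{T} = {K:.3f}, estimated {T}-year discharge = {x_T:.0f} m^3/s")
```

The frequency factor lets one skip inverting the cumulative distribution directly: K_T depends only on the assumed distribution and the return period, and the estimate is then a linear combination of the sample mean and standard deviation.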

How would mass and time correlations for calculating the magnitudes of price action correspond to this example of river discharge?

How could frequency analysis treat trends in price action as analogous to hydrologic data, specifically relating the magnitude of extreme events to their frequency of occurrence using probability distributions?

The article notes that the hydrologic data to be analyzed are assumed to be independent and identically distributed; the hydrologic system is assumed to be stochastic, space-independent, and time-independent. Can the same assumptions be established for the analysis of price action?

The article continues that, for frequency analysis, data should be properly selected so that the assumptions of independence and identical distribution are satisfied. What rules-based determinations govern such data selection as a given model is applied from one security to another, and from market to market?

The article also holds that “assumption of identical distribution or the homogeneity is achieved by selecting the observations from the same population (i.e., no changes in the watershed and recording gauges are made).” At what level of data input-output processing can this distribution be realized?

It says that the “assumption of independence is achieved by selecting the annual maximum of the variable being analyzed,” wherefrom “successive observations from year to year will be independent.” But what rules-based formulation defines what constitutes an annual maximum?
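The annual-maximum (block-maxima) selection itself is a simple reduction from a dated series to one value per year. A minimal sketch, with invented dates and values:

```python
from collections import defaultdict

# Block-maxima selection: reduce a dated series to one value per year,
# its annual maximum.  Dates and values below are invented placeholders.
daily = [
    ("2010-03-01", 101.2), ("2010-07-15", 118.9), ("2010-11-02", 109.4),
    ("2011-02-10", 115.0), ("2011-08-21", 127.3),
    ("2012-05-05", 122.8), ("2012-12-30", 119.6),
]

annual_max = defaultdict(lambda: float("-inf"))
for date, value in daily:
    year = date[:4]  # ISO dates: the first four characters are the year
    annual_max[year] = max(annual_max[year], value)

for year in sorted(annual_max):
    print(year, annual_max[year])
```

The rules-based question raised above lives in choices this sketch glosses over: whether the "year" is calendar or water year, and what interval boundary separates one block's maximum from the next.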

Moreover, what interval determination is necessary to correlate periodic outcomes?

The article distinguishes between complete and partial duration series relative to a predefined base value. Again, what rules-based assumptions underlie a given base-value computation, and (in an extreme value series) what distinguishes the largest or smallest values occurring relative to intervals of record?

The questions unanswered here suggest nonlinear distinctions that return us to Victor’s random walk discussions during his career. The point?

Is the randomness of financial markets only as random as a corresponding financial model’s failure to minimize its stochastic, open-loop processing of input-output-state indications?

Systematic processing suggests that this issue is predicated on understanding how rules-based assumptions are processed when models rely on mathematical analysis such as frequency and magnitude distinctions.

At the end of this discussion, we are left to ponder how the magnitude of directional price action could be considered (as an extreme event) inversely proportional to its frequency of occurrence. Thus, at this juncture, one faces an issue of topology, for magnitude rests on a rules-based distinction between the extreme and the normative.

dr