Apr 24

I found the paper Stochastic Hydrology, Lecture 24, to be the best discussion of extreme values useful in finance that I have come across. It uses the time between successive maximum events as a basis for calculating the probability that a maximum will be exceeded, using simple, relatively understandable quantities that a person can calculate and count with pencil and paper. It ends a rather time-consuming search for such a method.
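
A minimal sketch of that counting idea, assuming the usual convention that the return period of a level is roughly the average number of years between events equalling or exceeding it; the series and the level are invented for illustration:

```python
# Hypothetical annual maxima (could be yearly maximum drawdowns, ranges, etc.)
annual_maxima = [4.2, 7.9, 3.1, 9.5, 5.0, 8.8, 6.1, 12.3, 4.7, 7.2]

def exceedance_probability(maxima, level):
    """Fraction of years whose maximum equals or exceeds `level`."""
    n_exceed = sum(1 for x in maxima if x >= level)
    return n_exceed / len(maxima)

level = 8.0
p = exceedance_probability(annual_maxima, level)
print(f"P(annual maximum >= {level}) ~ {p:.2f}")   # 3 of 10 years -> 0.30
print(f"Return period ~ {1 / p:.1f} years")        # roughly 3.3 years between such events
```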


Comments


1 Comment so far

  1. douglas roberts dimick on April 28, 2013 10:07 am

    How Stochastic Hydraulic States Fail to Indicate The Quantitative Relativity of Price Action

    An implied methodology utilizing stochastic hydraulic state analysis to determine probabilities relative to price action is problematic because of translational barriers in correlating rules-based assumptions from the mean and standard deviation (output) of open-system processing into closed-loop systematics.

    “Stochastic” comes from the Greek word στόχος, which means “aim”. It also denotes a target stick; the pattern of arrows around a target stick stuck in a hillside is representative of what is stochastic.

    The issue herewith then revolves around that time denomination, being present to future. From a standpoint of risk management, such unknown events (for example, predicting price action) are synonymous with a stochastic system that generates a state that is non-deterministic, because a subsequent state is determined both by the system’s predictable actions and by a random element.

    Interesting that Victor’s cited lecture is preceded by one about Markov chains…http://nptel.iitm.ac.in/courses/105108079/module6/lecture24.pdf… to wit:

    Given that a DTMC “is a mathematical system that undergoes transitions from one state to another, between a finite or countable number of possible states,” note also that it is a “random process usually characterized as memoryless: the next state depends only on the current state and not on the sequence of events that preceded it… [a] specific kind of memorylessness… [that has] many applications as statistical models of real-world processes.”
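
    A minimal sketch of the DTMC in that quoted definition, with an invented two-state transition matrix; the next state is sampled from probabilities that depend only on the current state, not on the path that led to it:

```python
import random

# Hypothetical two-state chain ("calm", "volatile") with invented transition probabilities.
transition = {
    "calm":     {"calm": 0.9, "volatile": 0.1},
    "volatile": {"calm": 0.3, "volatile": 0.7},
}

def next_state(current):
    """Memoryless step: depends only on `current`, not on the preceding sequence."""
    r, cumulative = random.random(), 0.0
    for state, p in transition[current].items():
        cumulative += p
        if r < cumulative:
            return state
    return state  # guard against floating-point rounding

state = "calm"
path = [state]
for _ in range(10):
    state = next_state(state)
    path.append(state)
print(path)
```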

    The point here is that the random process, as articulated, requires an open system. However, my research indicates that closed systems are required to efficiently measure input-output-state indicators that otherwise are infrequent or random. This phenomenon is best witnessed during a given system’s state change.

    When it comes to understanding the significance of closed loop processing to minimize risk of stochastic modeling, my favorite is Lecture-8 State Machine Design, offered in an education series funded by the Government of India. See… http://www.youtube.com/watch?v=hg2QxXeI_-8

    Here the lecturer notes that a state is “a necessary evil” to achieve an outcome; “it is a means to achieve.” With any corresponding excitation table, the question then becomes what the next state is from a present state. System design focuses on the inputs to be processed and the outputs so generated; states concern what happens in between (the state changes and state transitions of that input-output-state processing).

    The synchronous system differs from the asynchronous one (which has no clock) in that inputs may or may not change the output. See the J-K flip-flop example in the cited Lecture 8 video.
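
    A minimal sketch of that J-K flip-flop behaviour, assuming the standard characteristic table (hold, reset, set, toggle); the present state plus the inputs determine the next state, and the change only takes effect at a clock step:

```python
def jk_flip_flop(q, j, k):
    """Next state Q+ of a J-K flip-flop on a clock edge (standard characteristic table)."""
    if j == 0 and k == 0:
        return q        # hold
    if j == 0 and k == 1:
        return 0        # reset
    if j == 1 and k == 0:
        return 1        # set
    return 1 - q        # toggle

# One clocked sequence: present state plus inputs determine the next state.
q = 0
for j, k in [(1, 0), (0, 0), (1, 1), (0, 1)]:
    q = jk_flip_flop(q, j, k)
    print(f"J={j} K={k} -> Q={q}")   # set, hold, toggle, reset
```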

    Note how synchronous systems are useful in “noisy situations” where input would otherwise indicate excitation. May market volatility and price data distinctions be so correlated?

    In that hydrologic systems are influenced by extreme events (severe storms, floods, droughts), frequency analysis is relevant. Frequency factors may or may not be comparable, given that price action reportedly channels for 80% of a given market session, generating such (coordinate) extreme events infrequently – though not necessarily randomly.

    As Victor’s cited article discusses, calculating magnitudes of extreme events requires a cumulative probability distribution function to be invertible. As some probability distribution functions are not readily invertible, alternative methods of calculating magnitudes of extreme events may use frequency factors. For instance, consider the annual maximum discharge of a river over 45 years. How does one calculate a frequency factor and obtain the annual maximum discharge value corresponding to a 20-year return period?
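
    A worked sketch of the frequency-factor route, assuming the Gumbel (Extreme Value Type I) frequency factor and invented sample statistics for the 45-year annual-maximum series; the lecture’s own numbers are not reproduced here:

```python
import math

def gumbel_frequency_factor(T):
    """Gumbel (EV Type I) frequency factor K_T for return period T in years."""
    return -(math.sqrt(6) / math.pi) * (0.5772 + math.log(math.log(T / (T - 1))))

# Hypothetical sample statistics of a 45-year annual maximum discharge series (invented).
mean_discharge = 5000.0   # cubic metres per second
std_discharge = 1200.0

T = 20
K_T = gumbel_frequency_factor(T)            # about 1.87 for T = 20
x_T = mean_discharge + K_T * std_discharge  # magnitude with a 20-year return period
print(f"K_{T} = {K_T:.3f}, x_{T} = {x_T:.0f}")
```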

    How would mass and time correlations in calculating magnitudes of given price action be correlative to this example of river discharge?

    How could frequency analysis relate trending of price action as being analogous to considering hydrologic data, specifically the magnitude of extreme events to their frequency of occurrence using probability distributions?

    The article notes that hydrologic data to be analyzed is assumed to be independent and identically distributed; the hydrologic system is assumed to be stochastic, space-independent and time-independent. Can the same assumptions be determined for analysis of price action?

    The article continues that, for frequency analysis, data should be properly selected so that the assumptions of independence and identical distributions are satisfied. What rules-based determinations govern such data selections as a given model is applied from one security to another as well as market to market?

    The article also holds that “assumption of identical distribution or the homogeneity is achieved by selecting the observations from the same population (i.e., no changes in the watershed and recording gauges are made).” At what level of data input-output processing can this distribution be realized?

    It says that the “assumption of independence is achieved by selecting the annual maximum of the variable being analyzed,” wherefrom “successive observations from year to year will be independent.” But what rules-based formulation(s) defines what constitutes an annual maximum?
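
    A minimal sketch of one possible rules-based answer, assuming the block-maxima convention from the article (one maximum per calendar year) applied to an invented daily series:

```python
from collections import defaultdict
from datetime import date, timedelta
import random

# Invented daily series: (date, absolute daily price move).
random.seed(1)
start = date(2010, 1, 1)
daily = [(start + timedelta(days=i), abs(random.gauss(0, 1))) for i in range(3 * 365)]

# Annual maximum series: the single largest observation of each year.
annual_max = defaultdict(float)
for d, value in daily:
    annual_max[d.year] = max(annual_max[d.year], value)

for year in sorted(annual_max):
    print(year, round(annual_max[year], 3))
```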

    Moreover, what interval determination is necessary to correlate periodic outcomes?

    The article distinguishes between complete and partial duration series relative to a predefined base value. Again, what rules-based assumptions underlie a given base value computation and (in an extreme value series) the distinction between the largest or smallest values occurring relative to intervals of record?
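
    A minimal sketch of the distinction, assuming an arbitrary, invented base value: the complete duration series keeps every observation, while the partial duration series keeps only those above the base value:

```python
# Invented observations (e.g. daily maximum price moves across several sessions).
observations = [0.4, 2.6, 1.1, 3.8, 0.9, 2.1, 0.3, 5.2, 1.7, 2.9]

base_value = 2.0  # arbitrary, illustrative base value

complete_series = observations                                   # everything
partial_duration = [x for x in observations if x > base_value]   # exceedances of the base value

print("complete:", len(complete_series), "values")
print("partial duration (> base):", partial_duration)
```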

    The questions unanswered here suggest nonlinear distinctions that return us to Victor’s random walk discussions during his career. The point?

    Is the randomness of financial markets only so (random) to the extent that a corresponding financial model fails to minimize its stochastic-based open loop processing of input-output-state indications?

    Systematic processing suggests that this issue is predicated on understanding how rules-based assumptions are processed when models rely on mathematical analysis such as frequency and magnitude distinctions.

    At the end of this discussion, we are left to ponder how the magnitude of directional price action could be considered (as an extreme event) inversely proportional to its frequency of occurrence. Thus, at this juncture, one faces an issue of topology, for magnitude turns on a rules-based distinction between extreme and normative.

    dr
