Randomness, from Jeff Watson

April 10, 2011

I've been thinking a lot about randomness lately. Trying to define it, I presume it can only be defined negatively, as the absence of any discernible or systematic pattern. I believe complete randomness can only be disproved, never proven: any test will only detect a single pattern or a group of related patterns. I would appreciate any thoughts on randomness in a philosophical vein, as there might be a few meals lying right under our noses.
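A minimal sketch of the "disprove, never prove" point, under my own choice of tests (nothing here is from the original post): an alternating 0/1 sequence sails through a frequency (balance) check, yet a runs count rejects it immediately. Passing one test only means that one pattern wasn't found.

```python
def runs_count(bits):
    """Number of runs: maximal blocks of identical consecutive symbols."""
    return 1 + sum(1 for a, b in zip(bits, bits[1:]) if a != b)

# 0101010101... : perfectly balanced, yet wildly non-random.
bits = [i % 2 for i in range(100)]

n = len(bits)
ones = sum(bits)            # frequency test: 50 ones out of 100 -- passes
runs = runs_count(bits)     # runs test: 100 runs -- every symbol is its own run

# For a truly random sequence with these counts, expected runs is about
# 2*ones*(n-ones)/n + 1, i.e. roughly n/2 + 1 here.
expected = 2 * ones * (n - ones) / n + 1
print(ones, runs, expected)  # balanced counts, but far too many runs
```

Each test can only rule out the particular kind of structure it looks for; no finite battery of tests certifies randomness.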

Gary Rogan writes:

Just some random thoughts on the subject. Randomness signifies the lack of an informational connection between the process that generates one event and any other event. There are two kinds of connection: specific knowledge, where one process knows what another is doing, and inherent similarity in how the processes are constructed. Imagine that you need to pick 100 random numbers. You could take 100 individuals, put them in separate rooms, and let each pick a number. They would satisfy the lack of the first type of connection, but not the second: their picks will not be truly random, because human beings of any kind share enough similarities to violate it, though their picks will still be more random than if they chose together as a group. So the trick is to find processes that have no connection to each other and no preference for any particular number within the rules of what's acceptable.
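The separate-rooms experiment can be simulated. In this hedged sketch (the bias model is my own assumption, not anything from the post), each simulated "person" picks independently, so the first type of connection is absent; but all share the same built-in preference for 7, a construction similarity of the second type, and a simple chi-square statistic against the uniform distribution exposes the shared bias in the aggregate.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def human_pick():
    # Assumed bias: people are widely reported to over-pick 7 on 1-10.
    weights = [1, 1, 2, 2, 2, 2, 6, 2, 1, 1]
    return random.choices(range(1, 11), weights=weights)[0]

def chi_square(picks, k=10):
    """Pearson chi-square statistic against the uniform distribution on 1..k."""
    n = len(picks)
    expected = n / k
    counts = [picks.count(v) for v in range(1, k + 1)]
    return sum((c - expected) ** 2 / expected for c in counts)

humans = [human_pick() for _ in range(1000)]     # independent but similar
uniform = [random.randint(1, 10) for _ in range(1000)]

# The shared bias inflates the statistic far beyond the uniform baseline.
print(chi_square(humans), chi_square(uniform))
```

Separate rooms remove communication, not kinship: the second type of connection survives isolation, which is exactly Gary's point.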

George Parkanyi adds: 

Randomness seems to be overlaid on some kind of order – a basic framework within which seemingly unconnected events then play out to set up our environment and our experiences. It's like a board game – a basic set of rules plus random elements, say dice, shuffled cards and individual decisions, that ensure no two games will ever be played exactly the same way. The game overall works toward a predictable outcome (someone winning), but the means of getting there will never be the same for any two plays.

Mark Schuetz writes: 

Apologies if Rumsfeld's quote has become hackneyed, but I think it describes one facet of randomness well.

"There are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – the ones we don't know we don't know."

Some always think of randomness as "known unknowns": everything is determined by some underlying process or fits some probability distribution. Depending on one's definition of randomness, perhaps there are more "unknown unknowns" than meet the eye: a truly random event or series of events might not be determined by any underlying logical process, and a descriptive probability distribution might not exist or might be impossible to know.

Russ Sears adds:

Randomness is a major topic in abstract algebra, and studying it almost became my career after grad school. I'm not sure I can do it justice now, having been away from the subject for so long. However, in a sequence of numbers (most events or things can be numbered), if there is no way to discern step t+delta from step t, even by narrowing down its probability, then by most definitions the sequence is random. For practical purposes, to "create" something random is really a matter of hiding the pattern so that these probability distributions cannot be discovered, and you do this through the sheer size of the numbers involved. In other words, the process is deterministic (it really can be traced to causes and effects), but the numbers involved make doing so impossible in practice: either the determining factors cannot be measured accurately enough (think lottery ball drawings, or weather and chaos), or the "code" is varied and on such a large scale that only those with the "key" can decipher it.
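The deterministic-but-hidden idea above can be sketched with a toy generator. This is not Russ's construction, just a standard illustration: a linear congruential generator is pure arithmetic, yet its output looks like noise to anyone who lacks the internal state, while anyone holding the seed (the "key") reproduces the stream exactly. The constants are the classic Numerical Recipes parameters.

```python
class LCG:
    """Toy linear congruential generator: x_{n+1} = (a*x_n + c) mod m."""

    def __init__(self, seed):
        self.state = seed

    def next(self):
        # Completely deterministic update, no randomness anywhere.
        self.state = (1664525 * self.state + 1013904223) % 2**32
        return self.state

g1 = LCG(seed=12345)
g2 = LCG(seed=12345)

# Same seed -> the "random-looking" streams are identical.
stream1 = [g1.next() for _ in range(5)]
stream2 = [g2.next() for _ in range(5)]
assert stream1 == stream2
print(stream1)
```

To an observer without the state, recovering the pattern means solving for the parameters and seed; scale the modulus up (as real cryptographic generators do) and that recovery becomes computationally hopeless, which is exactly the "only those with the key can decipher it" point.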

