Is this knowledge? / Randomness versus causality / How safe is the 'urn of nature' model?
Footnotes
https://manyworlds784.blogspot.com/p/footnotes.html
Is this knowledge?
"Science cannot demonstrate that a cataclysm will not engulf the universe tomorrow, but it can prove that past experience, so far from providing a shred of evidence in favour of any such occurrence, does, even in the light our ignorance of any necessity in the sequence of our perceptions, give an overwhelming probability against such a cataclysm." -- Karl Pearson (41)
I would argue that no real knowledge is gained from such an assessment. Everyone, unless he heeds some occasional prophet, will perforce act as though such an assessment is correct. What else is there to do? Decision-making is not enhanced. However, we suggest that, even in the 21st century, physics is not so deeply understood as to preclude such an event. Who knows, for example, how dark matter and dark energy really behave at fundamental levels? Not enough is known to build effective predictive algorithms on that scale; the scope is too great for the analytical tool. The truth is that the propensities of the system at that level are unknown.
This is essentially the same thought experiment posed by Hume, and "answered" by Laplace. Laplace's rule is derived via the application of the continuous form of Bayes's theorem, based on the assumption of a uniform probability distribution. That is, all events are construed to have equal probability, based on the idea that there is virtually no system information (propensity), so that all we have to go on is equal ignorance. In effect, one is finding the probability of a probability with the idea that the possible events are contained in nature's urn. With the urn picture in mind, one then is trying to obtain the probability of a specific proportion. (More on this later.)
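For readers who want the algebra, here is a minimal sketch of how the rule of succession falls out of the continuous form of Bayes's theorem under a uniform prior -- the standard textbook derivation, not Laplace's own notation:

```latex
% Rule of succession under a uniform prior on the unknown proportion p.
% After observing s successes in n independent trials, Bayes's theorem gives
\[
P(\text{success on trial } n+1 \mid s \text{ successes in } n \text{ trials})
  = \frac{\int_0^1 p \cdot p^{s}(1-p)^{n-s}\,dp}{\int_0^1 p^{s}(1-p)^{n-s}\,dp}
  = \frac{s+1}{n+2}.
\]
% With s = n (an unbroken run of successes) this is (n+1)/(n+2), which is the
% source of Pearson's "overwhelming probability" against a cataclysm.
```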
S.L. Zabell on the rule of succession
http://www.ece.uvic.ca/~bctill/papers/mocap/Zabell_1989.pdf
To recapitulate, in Pearson's day, as in ours, there is insufficient information to plug in initial values to warrant assigning a probability -- though we can grant that Laplace's rule of succession may have some merit as an inductive process, in which that process is meant to arrive at the "most realistic" initial system information (translated as an a priori propensity).
Randomness versus causality
Consider, says Jevons, a deck of cards dealt out in numerical order (with suits ranked numerically). We immediately suspect nonrandomness. I note that we should suspect nonrandomness for any specific order posited in advance. However, in the case of progressive arithmetic order, we realize that this is a common choice of order among humans. If the deck proceeded to produce four 13-card straight flushes in a row, one would surely suspect design.
But why? This follows from the fact that the number of possible orderings (permutations) of the deck is far higher than the number of "interesting" orderings.
Here again we have an example of the psychological basis of probability theory. If we took any order of cards and asked the question, what is the probability it will be dealt that way, we get the same result: (52!)^(-1) ≈ (8 x 10^67)^(-1).
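As a quick check on the figure quoted above, the factorial can be computed directly; a minimal Python sketch:

```python
import math

# Number of distinct orderings of a 52-card deck.
orderings = math.factorial(52)
print(f"52! ~ {float(orderings):.3e}")              # about 8.066e+67

# Probability of any one specific ordering under a fair shuffle.
print(f"P(specific order) ~ {1 / orderings:.3e}")   # about 1.240e-68
```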
Now suppose we didn't "call" that outcome in advance, but just happened upon a deck that had supposedly been dealt out after good shuffling. What is our basis for suspecting that the result reflects nonrandom activity, such as a card sharp's maneuvering? In this case a nonparametric test, such as a runs test or a sign test, is strongly indicative.
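The runs test mentioned here can be sketched concretely. The Wald-Wolfowitz version below, applied to the red/black color sequence of a dealt deck, is my own illustration of the kind of test meant (the normal approximation is only rough for a single deck):

```python
from math import sqrt

def runs_test(seq):
    """Wald-Wolfowitz runs test on a binary (0/1) sequence.
    Returns (observed runs, expected runs under randomness, z score)."""
    n1, n0 = seq.count(1), seq.count(0)
    n = n1 + n0
    runs = 1 + sum(1 for a, b in zip(seq, seq[1:]) if a != b)
    mean = 2 * n1 * n0 / n + 1
    var = 2 * n1 * n0 * (2 * n1 * n0 - n) / (n ** 2 * (n - 1))
    z = (runs - mean) / sqrt(var)
    return runs, mean, z

# Red/black sequence of a deck dealt out suit by suit (1 = red, 0 = black):
# long monochrome stretches mean far fewer runs than chance would produce,
# hence a large negative z score flagging nonrandomness.
suit_by_suit = [1] * 13 + [0] * 13 + [1] * 13 + [0] * 13
print(runs_test(suit_by_suit))   # roughly (4, 27.0, -6.4)
```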
Jevons had no nonparametric test at hand (unless one considers Laplace's rule to be such) but even so argued that if one sees a deck dealt out in arithmetical order, then one is entitled to reason that chance did not produce it. This is a simple example of the inverse method.
Jevons points out that, whether math is used or not, scientists tend to reason by inverse probability. He cites the example of the then recently noted flint flakes, many found with evidence of more than one stroke of one stone against another. Without resort to statistical analysis, one can see why scientists would conclude that the flakes were the product of human endeavor.
In fact, we might note that the usual corpus of standard statistical models does indeed aim to sift out an inverse probability of sorts in a great many cases, notwithstanding the dispute with the Bayesian front.
The following examples of the inverse method given by Jevons (42) are of the sort that Keynes disdained:
1. All larger planets travel in the same direction around the sun; what is the probability that, if a new planet exterior to Neptune's orbit is discovered, it will follow suit? In fact, Pluto, discovered after Jevons's book was published (and since demoted from planetary status), also travels in the same direction around the sun as the major planets.
2. All known gaseous elements, excepting chlorine, are colorless; what is the probability that, if some new gaseous element is discovered, it will be colorless? And here we see the relevance of a system's initial information. Jevons wrote well before the electronic theory of chemistry was worked out. (And obviously, we have since run the gamut of stable elements, so the question is irrelevant from a practical standpoint.)
3. Bode's law: in Jevons's day, the distance from the sun to each planet except Neptune showed close agreement with distances calculated from a specific mathematical expression, provided the asteroid belt was also counted. So Jevons reasoned that the probability that the next planet beyond Neptune would conform to this law is -- using Laplace's rule -- 10/11. As it happens, Pluto was found to be in fairly good agreement with Bode's law. Some experts believe that gravitational pumping by the larger planets has swept out certain orbital zones, leaving harmonic distances favorable to long-lasting orbiting of matter.
The fact that Laplace's rule "worked" in terms of giving a "high" ranking to the plausibility of a Bode distance for the next massive body to be found may be happenstance. That is, it is plausible that a physical resonance effect is at work, which for some reason was violated in Neptune's case. It was then reasonable to conjecture that these resonances are not usually violated and that one may, in this case, assign an expert opinion probability of maybe 80 percent that the next "planet" would have a Bode distance. It then turns out that Laplace's rule also gives a high value: 90.9 percent. But in the first case, the expert is using her knowledge of physics for a "rough" ranking. In the second case, no knowledge of physics is assumed, but a definitive number is given, as if it is somehow implying something more than what the expert has suggested, when in fact it is an even rougher means of ranking than is the expert's.
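To make the arithmetic explicit, here is a minimal Python sketch of Laplace's rule as used above; the assumption that the 10/11 figure corresponds to nine conforming bodies in nine observed cases is my inference from the number quoted, not Jevons's own tabulation:

```python
def rule_of_succession(successes, trials):
    """Laplace's rule: P(next success) = (s + 1) / (n + 2)."""
    return (successes + 1) / (trials + 2)

# Bode's-law example: nine conforming bodies in nine cases (assumed counts).
print(rule_of_succession(9, 9))    # 0.9090..., i.e. the 90.9 percent in the text

# After n unbroken successes, the rule gives (n + 1)/(n + 2).
print(rule_of_succession(20, 20))  # 21/22, as in the urn example further on
```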
Now one may argue that probability questions should not be posed in such cases as Jevons mentions. Yet if we remember to regard answers as tools for guidance, they could possibly bear some fruit. However, an astronomer might well scorn such questions because he has a fund of possibly relevant knowledge. And yet, though it is hard to imagine, suppose life or death depends on a correct prediction. Then, if one takes no guidance from a supernatural higher power, one might wish to use a "rationally" objective probability quantity.
Induction, the inverse method and the "urn of nature" may sound old-fashioned, but though the names may change, much of scientific activity proceeds from these concepts.
How safe is the urn of nature model?
Suppose there is an urn holding an unknown but large number of white and black balls in unknown proportion. If one were to draw 20 white balls in a row, "common sense" would tell us that the ratio of black balls to white ones is low (and Laplace's rule would give us a 21/22 probability that the next ball is white). I note that "common sense" here serves as an empirical method. But we face two issues: first, the assumption that the balls are well mixed, which is to say that the urn composition is presumed homogeneous; second, that if the number of balls is high, even homogeneity needn't preclude a local cluster of balls that yields 20 white draws.
We need here a quantitative way to measure homogeneity; and this is where modern statistical methods might help, given enough input information. In our scenario, however, the input information is insufficient to justify the assumption of a 0.5 ratio. Still, a runs test is suggestive of non-randomness in the sense of a non-0.5 ratio.
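As a rough illustration of both points (my own sketch, not from the text): under the well-mixed, 50/50 assumption, 20 straight white draws is about a one-in-a-million event, while an urn of identical overall composition that is poorly mixed can produce them trivially if the white balls happen to sit where we are sampling:

```python
# Under the well-mixed 50/50 assumption, 20 straight white draws is rare.
p_well_mixed = 0.5 ** 20
print(f"P(20 white in a row | well-mixed 50/50 urn) = {p_well_mixed:.2e}")  # ~9.5e-07

# But a 50/50 urn that is NOT mixed -- all the white balls clustered where we
# happen to be drawing -- yields 20 white draws trivially.
urn = ["white"] * 500 + ["black"] * 500   # 50/50 overall, but clustered
print(all(ball == "white" for ball in urn[:20]))   # True
```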
Another issue with respect to induction is an inability to speak with certainty about the future (as in, will the ball drop if I let go of it?). This in fact is known as "the problem of induction," first raised by Hume.
To summarize some points previously made, induction is a process of generalizing from observed regularities. These regularities may be gauged by applying simple frequencies, or by rule-of-succession reasoning, in which we have only inference or a propensity of the system, or by deductive reasoning, whereby we set up an algorithm that, when applied, accounts for a range of phenomena. Here the propensity is given a nonzero information value. Still, as said, such deductive mechanisms depend on some other information -- "primitive" or perhaps axiomatic propensities -- as in the regularities of gravity. Newton and Einstein provide nonzero framework information (propensities) leading to deductions about gravity in specific cases. However, the deductive system of Newton's gravitational equation depends on the axiomatic probability 1 that gravity varies by the inverse square of the distance from a ponderable object's center of mass -- a relation that has, at non-relativistic magnitudes, been verified extensively, with very occasional anomalies ignored as measurement outliers.
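For concreteness, the regularity being treated here as axiomatic is the inverse-square form of Newton's law, stated below only as a reminder in the usual notation:

```latex
% Newton's law of universal gravitation: the attraction between two point
% masses varies inversely with the square of the distance between them.
\[
F = G\,\frac{m_1 m_2}{r^2},
\qquad G \approx 6.674 \times 10^{-11}\ \mathrm{N\,m^2\,kg^{-2}}.
\]
```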
Popper in Schism takes note of the typical scientist's "metaphysical faith in the existence of regularities in the world (a faith which I share and without which practical action is hardly conceivable)" (43).
Measures of future uncertainty, such as the Gaussian distribution, "satisfy our ingrained desire to 'simplify' by squeezing into one single number matters that are too rich to be described by it. In addition, they cater to psychological biases and our tendency to understate uncertainty in order to provide an illusion of understanding the world," observed Benoit Mandelbrot and Nassim Taleb.
Some outliers are just too big to handle with a normal curve. For example, if a 300-pound man's weight is added to the weights of 100 other persons, it isn't likely to have a substantial effect on the mean. But if Bill Gates's net income is added to the incomes of 100 other persons, the mean will be mean-ingless. Similarly, the Fukushima event was, by Gaussian methods, extraordinarily improbable. But the world isn't necessarily as Gaussian as we would like to believe.

As said previously, one way to approach the idea of regularity is via pattern recognition matrices. If a sufficient number of entries in two matrices are identical, the two "events" so represented are construed as identical, or as similar to varying degrees between 1 and 0. But of course, we are immediately brought to the concept of perception, and so we may say that Popper has a metaphysical faith in the reality-sensing and construction process of most people's minds, not including some severe schizophrenics. (See Toward.)
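The point about the mean can be seen with a toy calculation (the numbers below are purely illustrative):

```python
import statistics

# Illustrative numbers: 100 incomes of $50,000 plus one Gates-sized outlier.
incomes = [50_000] * 100 + [10_000_000_000]
print(f"mean income:   {statistics.mean(incomes):,.0f}")    # ~99,059,406 -- dominated by the outlier
print(f"median income: {statistics.median(incomes):,.0f}")  # 50,000 -- barely notices it

# Adding one 300-pound person to 100 people near 170 pounds barely moves the mean.
weights = [170] * 100 + [300]
print(f"mean weight: {statistics.mean(weights):.1f}")       # ~171.3
```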
Imagine a situation in which regularities are disrupted by sudden jumps in the mind's reality constructor. Life under such conditions might be unbearable and require neurological attention. On the other hand, sometimes "miracles" or "works of wonder" are attested to, implying that some perceive occasional violations of the humdrum of regularities, whether as a result of a psychosomatic process, wishful/magical thinking, or some sort of intersection with a noumenal world (Part VI, see sidebar).
The theme of "regularities" coincides with what Gott calls the Copernican principle, which I interpret as meaning a metaphysical faith that the rules of nature are everywhere the same (except perhaps in parallel universes).
Gott on the 'Copernican principle'
http://www-psych.stanford.edu/~jbt/224/Gott_93.pdf
It is important to face Hume's point that scientific ideologies of various sorts rest upon unprovable assumptions. For example, the Copernican principle, which Gott interprets as meaning that a human observer occupies no special time or place in the cosmos, is a generalization of the Copernican-Galilean model of the solar system. Interestingly, by the way, the Copernican principle contrasts with the anthropic cosmological principle (discussed later).
Einstein's belief in what must be considered a form of Laplacian realism is put in sharp relief with this assertion:
“The only justification for our concepts and system of concepts is that they serve to represent the complex of our experiences; beyond this they have no legitimacy. I am convinced that the philosophers have had a harmful effect upon the progress of scientific thinking in removing certain fundamental concepts from the domain of empiricism, where they are under our control, to the intangible heights of the a priori. For even if it should appear that the universe of ideas cannot be deduced from experience by logical means, but is, in a sense, a creation of the human mind, without which no science is possible, nevertheless this universe of ideas is just as little independent of the nature of our experiences as clothes are of the form of the human body” (44).
The problem of induction is an obstacle for Einstein. Scientific inquiry requires that it be ignored. However, one might say that this irrational rationalism led to a quagmire that he was unable to see his way past, despite being a friend and colleague of the physicist-turned-logician Kurt Goedel, who had strong reservations about what is sometimes termed Einstein's naive realism.
Another take on this subject is to make a formally valid statement: A implies B, which is to say, "If A holds, then so does B." So if one encounters A as being true or as "the case," then he can be sure that B is also the case. But, at some point in his chain of reasoning, there is no predecessor to A. So then A must be established by induction or accepted as axiomatic (often both) and not by deduction. A is not subject to proof within the system. Of course this is an elementary observation, but those who "believe in Science" need to be reminded that scientific method is subject to certain weaknesses inherent in our plane of existence.
So we tend to say that though theories cannot be proved true, there is a level of confidence that comes with how many sorts of phenomena and how many special cases are successfully predicted by a theory (essentially, in the "hard" sciences, via an algorithm or set of algorithms).
But the fact that some theories are quite successful over a range of phenomena does not mean that they have probabilistically ruled out a noumenal world. It does not follow that successful theories of phenomena (and they are not fully successful) demonstrate that a noumenal world is highly improbable. In fact, David Bohm's struggles with quantum theory led him to argue that the world of phenomena must be supplemented by a noumenal world (he did not use that term) that permits nonlocality via an "implicate," or unseen, order.
The noumenal world is reflected in our attempts to contend with randomness. The concept of pure randomness is, I would say, an ideal derived from our formalizations of probability reasoning. Consider a notionally constructible binary string in which each unit is selected by a quantum detector: suppose, for example, that the algorithm's clock is set to 1-second intervals. A detection of a cosmic ray in a given interval is recorded as a 1, whereas no detection during the interval is recorded as a 0.
If this algorithm runs to infinity, we have a probability of 1 that every possible finite substring appears an infinity of times (ignoring the presumed change in cosmic ray distribution tens of billions of years hence). This follows from the fact that, if a given substring has probability p of occurring in any particular block of intervals, the probability that it occurs at least once in n blocks is 1 - (1 - p)^n, which goes to 1 as n goes infinite. So, for example, Fred Hoyle noticed that if the cosmos is infinite in size, we would expect an infinity of Fred Hoyles spread across the cosmos.
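A sketch of the notional detector string follows; the detection probability per interval is an arbitrary assumption on my part, and the point is only that any fixed finite pattern keeps turning up as the string grows:

```python
import random

random.seed(0)
p_detect = 0.3         # assumed probability of a cosmic-ray hit per 1-second interval
n_intervals = 200_000  # a long (but finite) stand-in for the infinite string

# Build the notional binary string: 1 = detection in the interval, 0 = none.
bits = "".join("1" if random.random() < p_detect else "0" for _ in range(n_intervals))

# Any fixed finite pattern keeps turning up as the string grows.
pattern = "1011001"
print(bits.count(pattern))   # typically hundreds of (non-overlapping) occurrences
```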
But, unless you are quite unusual, such a possibility doesn't accord with your concept of reality, does it? You have an inner sense here that we are "playing with numbers." And yet, in the Many Worlds interpretation of quantum physics, there is either an infinite or some monstrously large set of cosmoses in which versions of Fred Hoyle are found many times over -- and remarkably, this multiplicity scenario was developed in order to affirm causality and be rid of the notion of intrinsic randomness.
The Many Worlds "multiverse" is one explanation of what I term the noumenal world. But this interpretation has its problems, as I discuss in Toward.
Yet it is hard not to make common cause with the Many Worlds defenders, arguing that randomness run amok does not seem an appropriate representation of the universe.
Go to Chapter 7 HERE.