@Holiday20310401,
Holiday20310401 wrote: "whereby system behavior moves from steady-state to extremely complex patterns with increasingly random elements"
What is a random element? I thought randomness was about perception and cognition abilities.
Though naturally there is no pure randomness, I can't see complexity adding to randomness in any way; it would just require a better understanding of the system to see that the randomness hasn't been influenced at all.
H20310401,
With regard to randomness, I was not thinking of the maximal metaphysical concept of a complete lack of order. In that case, a continuing phenomenon would be completely unbounded; its upcoming values and states could change to any state or number allowed in our universe (or NOT allowed, if COMPLETELY COMPLETELY unbound -- a universe that allows all and indeed requires infinities). I am using the more limited concept of randomness that is used in statistics, i.e. randomness that is bounded by higher-order regularities, e.g. probability distributions. This kind of randomness says that there is no way to know just what the next value or state is going to be; but there are ways to determine the probability of any possible future state. In a number chain, this means that the only way to tell someone else how that chain of numbers has turned out is to send all of the numbers produced. There is no way to shorten it, no way to convey it with fewer digits of information. By contrast, take the case of the number sequence 1, 1, 2, 3, 5, 8, 13, . . . . Say that you've let it go out to 1000 numbers; do you need to send enough digits to convey 1000 individual numbers? No, you just say "starts with 1, then 1, then every following number equals the sum of the previous two; continue 998 times." Much more economical - and not random.
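To make the compressibility point concrete, here's a little Python sketch of my own (not something from the original discussion): the 1000-number Fibonacci chain can be regenerated from a rule only a few dozen characters long, whereas a genuinely random chain could only be conveyed by sending every number.

```python
def fibonacci(n):
    """Regenerate n Fibonacci numbers from the short rule:
    start with 1, 1, then each number is the sum of the previous two."""
    seq = [1, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

# The entire 1000-number chain collapses to this short description...
rule = "start 1, 1; each next number = sum of previous two; continue 998 times"
seq = fibonacci(1000)

print(len(rule))  # a few dozen characters of "rule"...
print(len(seq))   # ...regenerates all 1000 numbers exactly
```

A random chain of 1000 numbers has no such shortcut: the shortest description of it is, essentially, the chain itself.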
However, even in the case where there is no way of saying what the next number will be based upon previous number events (i.e., randomness as I am using the term), there may still be probabilities and aggregate relationships. E.g., if the series is a random coin toss where 0 is heads and 1 is tails, and the coin is weighted such that heads comes up 60% of the time and tails occurs 40% of the time, you can't say what the next number will be based on the chain of past numbers, but you can say that the chances of a 0 are 60%, and the aggregate count of 0's relative to all numbers is expected to be close to 60% (closer and closer as the total of all numbers increases).
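The weighted-coin case can also be simulated (again, my own illustrative sketch, with a hypothetical `weighted_toss` helper): no individual toss is predictable from the ones before it, yet the aggregate frequency of heads closes in on 60% as the count grows.

```python
import random

random.seed(1)  # fixed seed just to make this run reproducible

def weighted_toss(n, p_heads=0.6):
    """Simulate n tosses of a 60/40 weighted coin; 0 = heads, 1 = tails."""
    return [0 if random.random() < p_heads else 1 for _ in range(n)]

# Each toss is individually unpredictable, but the running frequency of
# heads converges toward the 60% probability as the sample grows.
for n in (100, 10_000, 1_000_000):
    tosses = weighted_toss(n)
    print(n, tosses.count(0) / n)
```

The probabilities constrain the aggregate without ever pinning down the next value; that is the "bounded" randomness I have in mind.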
As to complexity and this kind of limited randomness: my rough intuition is that a more complex mechanism or process gives increased opportunities for a steady-state system to enter a more highly random state; again, this randomness being the mild version, and not the "whole world has gone crazy" variety. A rough comparison is a fast-flowing, unobstructed river. It's pretty easy to predict the progress of any particular water molecule when the river has no obstacles. The river is a fairly simple system at that point. Put in one or two obstructions, e.g. a bridge pier, and the water flows can still be easily understood. But make the river go over a precipice with rocks and boulders scattered on it, and water starts splashing in all sorts of directions. Imagine a string of water molecules following each other down the river, nice and orderly; but they go in different directions once they reach the "complex" part, i.e. the boulder-strewn waterfalls. One goes one way, the one right behind goes another, no one can say why. In the aggregate, on average, you can say that they keep going forward, they eventually continue downstream from the waterfalls; but each molecule's individual path through the rocks and falls cannot be predicted based on where the ones ahead of it went. The rocks and precipice have put the water / river system into temporary chaos. That, roughly speaking, is an example of complexity causing chaos and increased randomness.
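A standard textbook stand-in for the river picture is the logistic map, x → r·x·(1−x) (my substitution, not part of the original river example). Turning up the parameter r plays the role of adding boulders: at low r the system settles into a steady state, while past r ≈ 3.57 two "molecules" starting almost on top of each other end up in completely different places.

```python
def trajectory(r, x0, steps):
    """Iterate the logistic map x -> r*x*(1-x) from x0 for the given steps."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

calm = trajectory(2.8, 0.2, 50)           # unobstructed river: settles down
chaos_a = trajectory(3.9, 0.2, 50)        # boulder-strewn regime
chaos_b = trajectory(3.9, 0.2000001, 50)  # a "molecule" right behind the first

# In the calm regime the trajectory converges to a single steady value.
print(round(calm[-2], 6), round(calm[-1], 6))

# In the chaotic regime a difference of one ten-millionth is amplified
# until the two trajectories bear no resemblance to each other.
divergence = max(abs(a - b) for a, b in zip(chaos_a, chaos_b))
print(divergence)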
(Please excuse the 101 level of these explanations; you've heard them before, I'm repeating them mostly for my own sake).
The brain might be compared to that river, a river of information signals (mediated by action potentials firing along neurons and across synapses). In some instances, the flow is slow and relatively unimpeded, and the randomness is low. But increase the flow and direct it through complex junctions and processes, and a point of chaos transition can occur. E.g., put a person under a lot of pressure and confusing circumstances, and their behavior can become unpredictable.
As to your musings about the ULTIMATE nature of randomness and the ULTIMATE nature of the universe and the ULTIMATE nature of consciousness - well, I'm not sure that I completely understand, but I am interested. My rough "folk" notion is that consciousness requires a mix of randomness and determinism - i.e., "deterministic chaos". Or, coming at it the other way, chaos creating order through emergence. This is something of a complementarity of contradictions, a yin-yang process of state transitions.
The purest of randomness would arguably require some forms of infinity, which doesn't seem to exist in our universe, either on the big side (the relativistic limit of light speed) or the small (quantum limits on smallness). The purest of determinisms, the Laplacian block universe, would arguably not require consciousness, given the general ("folk") association of consciousness and free will on an essential level. We perceive a world with conscious awareness but no infinities, with randomness and yet with finite predictabilities. Bounded potential and unconstrained possibility ARE tangled, and are only unfolded in the realm (i.e., abstract "state-space" that is non-4D or trans-4D) of consciousness. And in that process, in that tension between the random and the determined, somehow consciousness gives the physical entity that hosts it information about "the value of being" (with apologies for lack of a better, less New-Agey phrase), information that arguably has survival value for the being and its species. IMHO.
Jim G.