
25579 - Re: Multiverse Morality

  • hibbsa
    Aug 30, 2013
      --- In Fabric-of-Reality@yahoogroups.com, Gary Oberbrunner <garyo@...> wrote:
      >
      > On Fri, Aug 30, 2013 at 12:04 PM, hibbsa <hibbsa@...> wrote:
      > > Gary - you are too intelligent for me today...I was getting
      > > confused. So I'll try again tomorrow or day coming soon.
      >
      > Please do -- I want to understand your ideas.
      >
      > > You're a smart guy with a dumb theory :o)
      >
      > Could easily be. At least the dumb theory part.
      >
      > > Leaving you with this: It's pretty common in science that a theory
      > > with a lot of reach has humble beginnings in something
      > > parochial. Stealing word style from Deutsch here. In the best cases
      > > that local thing implies the wider implication, which leads to a
      > > process culminating in a mathematical theory that starts in that wider
      > > place and eventually finds its way back to the parochial place.
      >
      > > And so, a few months back I asked this list, whether anyone was
      > > trying to produce that mathematical theory of the MWI? Sure MWI is
      > > inferred from QM, but that doesn't matter at all. QM is parochial,
      > > it implies MWI, MWI is subjected to scientific rigour and genius,
      > > and a mathematical theory pops out the other end. Maybe profound
      > > non-trivial predictions are made not only about the multiverse but
      > > also about things we'd never even imagined right here. The
      > > predictions hold up, which drive a scientific revolution, which
      > > eventually sees the greatest ever technological revolution out of
      > > which humans come riding firmly in the saddle of the force of
      > > gravity itself. Gravity drives, warp drives, gravity weapons, maybe
      > > even a special dark door portaling this world to the next with more
      > > thereafter to the end of the multiverse.
      >
      > > Alan batted this away saying QM implied MWI not the other way
      > > around. I did mention at the time I didn't think that was legitimate.
      >
      > You are correct in my view. MWI is only one interpretation of QM.
      > But MWI is principally a mathematical theory -- we put words around
      > that mathematical explanation to help tell the story. But the math
      > _is_ the ground truth. (This bears on your other posting as well --
      > all true scientific explanations are primarily mathematical.) You
      > could say that MWI is more strictly mathematical than Copenhagen,
      > since Copenhagen also postulates that quantum states collapse when
      > observed, and it has no mathematical model for how and when this
      > happens. MWI has no such mathematical inconsistency.

      Gary - MWI is not defined with a set of equations that I know of...do you have a link?

      Perhaps you are attributing QM mathematics to MWI. Which IMHO would be wrong, even if it were 'right' that you could do it. QM is parochial, thus so is its mathematics. What I am talking about is a set of equations describing the multiverse itself. At least one falsifiable prediction would be possible as a consequence of such a set of equations, and that would be the prediction of how souls within individual universes would experience the multiverse. Now, if the multiverse equations could predict, literally predict, the precise form of Quantum Mechanics, or even go further and predict macroscopic concepts like spacetime, that would describe the ascent of MWI into Science. There would also be a Nobel or two in the offing.



      >
      > > How do you feel about it? And how is the progress coming along?
      >
      > I'm an engineer. Not a scientist and certainly not a philosopher,
      > except insofar as I care deeply about certain problems in those
      > areas. And from that engineering point of view, the entire field of
      > quantum computing, which is admittedly still small but is making
      > (ahem) quantum leaps every year, does not really exist without a
      > decent (non-Copenhagen) interpretation of QM.

      I don't understand the proposition of quantum computing very well. If it involves using superposition, then I would like to register a personal prediction right here: quantum computing won't ever happen on those terms.

      The reason I think that is that I see QM as one side of something of which the Big Bang is the other side. That is inherently a position which requires the 'bold conjecture' that our universe came about by a process very much analogous to Darwinian evolution, with the Big Bang as a 'development' or 'gestation' event: the actual evolution of all this took place back through pre-Big Bang history, as a progression starting with universes that flashed in and out of existence, then ones that lasted a bit longer, then others yet more enduring, probably thousands or millions of rounds, to get us here.

      I can't be arsed trying to explain why that is for now, but cross me kippers hope to die, it comes out of a very hard set of constraints.

      So anyway...that would be a very basic illustration of the reality that I see, and in that reality QM never gets explained directly at all. What happens instead is that the QM equations, because they are so accurate at predicting reality, become a surrogate for an empirically observable patch of reality. Then, in just the same way that 18th century chemistry came to be, that surrogate empirical landscape is paired with an embryonic, highly vague explanatory conception, in such a way that the components of the conception can be played around with until some small prediction about quantum-level reality falls out, which can in turn be tested using that surrogate empirical landscape.

      The process has to start like that, and for a long while it amounts to little better than attempts to mirror aspects of QM using a different conceptual model. The goal at that stage would be to arrange the embryonic theory in such a way that, at any given stage, there is precisely one uncontradicted logical consequence that translates into a prediction about quantum behaviour.

      Who knows if it would be successful and go all the way. But what can be said is that in order to be successful the process *has* to ultimately correspond to a history of predictions and falsifications, in which the predictions progressively got larger and more sophisticated, such that the overall distribution by prediction size tended to the exponential.

      Since such exponential distributions (e.g. asteroids in the solar system are exponentially distributed) ordinarily end up with one or two really big ones at the top level, it is at that level that the falsification event occurs. That falsification event is hugely important, to the process and to science itself. The problem, though, is that almost everyone has forgotten what a falsification event actually defines in science.
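
      To make that last point concrete, here is a quick throwaway sketch (mine, in Python, with the sample size and rate picked arbitrarily, purely as an illustration, not anything from the physics) of what the top of an exponentially distributed set of 'prediction sizes' looks like compared to the typical one:

        import random

        # Purely illustrative: 10,000 exponentially distributed "prediction sizes".
        # The rate parameter and sample size are arbitrary assumptions.
        random.seed(1)
        sizes = sorted(random.expovariate(1.0) for _ in range(10_000))

        median = sizes[len(sizes) // 2]
        print("median size:", round(median, 2))
        print("top 3 sizes:", [round(s, 2) for s in sizes[-3:]])
        # The largest one or two samples come out roughly an order of magnitude
        # above the median: the distribution is topped by a few really big ones.

      Whatever the exact numbers, the point is that an exponential tail puts one or two outsized predictions at the very top, and it is those that carry the decisive falsification event.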

      It needs to be remembered. And what it is, is actually not the falsification event itself, but the whole process of progressively reflecting an observable section of reality into a paired structure that is very different and based on very different conceptions, but nevertheless in some sense, at some level, exactly mirrors that quantum landscape; such that a self-consistent translation procedure would be possible, in which one side of the pair could be translated precisely into the other.

      And that is what used to happen in Science. It never got written down or defined, because when you are in that process your intuition for it grows with the predictions. But that is how the really big prediction came about. It was 'grown' organically. And it had to be, too, because for a prediction to be valuable in science it has to derive out of the core of a theory, such that everything about that prediction reflects some aspect of that theory. And then that prediction also has to marry up perfectly with some section of empirical reality that can be observed at sufficient resolution to confirm or falsify each part of the prediction.

      Those two conditions need to be in place for the prediction to be meaningful, both in terms of the theory and in terms of the observed landscape. But there are further requirements. The prediction has to include components that blow out at new levels, such that it is now telling us something about that landscape that is both substantial and running dramatically ahead of current technology, and thus cannot be confirmed or falsified...yet.

      Or that would be the first-class variation. The second class is where the prediction tells us something is somewhere we hadn't thought to look, but that we can immediately go and check on.

      But Deutsch, if he were reading this, would be shaking his head and asking why these prediction events are important. Well, the answer is simply that they aren't important, save in two highly parochial ways.

      1. They are important in that the nature of the process of discovery is totally dependent on the character and discipline of the individual, and his commitment to truth-seeking. This is because the process can be operated trivially, as simply a trial-and-error construction of some mirror of what is observed, one that does the job but adds absolutely nothing new because it has no explanatory value of its own. Just a mirroring exercise.

      In fact, because that can be the reality even with the best intentions, it becomes necessary to ASSUME the whole process was trivial, with no knowledge created. However, there is one way, and one way only, to differentiate between instances that create new knowledge and instances that don't. And that is the simple proposition that, in instances that create fundamentally new knowledge, and given the process involves an intimate pairing of the observable landscape and a theoretical construct, it should be equally possible for the theoretical side to run ahead of the empirical as the other way around.

      And this is why that falsification event is so fundamental. Not for large-scale philosophical reasons, but for local-scale procedural reasons: it is the only way to distinguish a process that apes out an explanationless construct from a process that produces a construct which says something real about that observable landscape, something that could not possibly be explained as an artefact of what the landscape said about the theoretical construct earlier on.

      That's the nature and place of prediction. Or that's where it begins. But of course predictions accumulate new value as they go along. They tell us something new, don't forget. Then they probably drive empirical improvements in the direction of achieving the resolution needed to see whether the prediction is right.

      Which is of course the seeding point across which science begins to make the leap into technology. It always starts with the local value the prediction brings to the empirical challenge of checking it. But in some instances that process naturally knocks on into other things. If such knock-ons exhibit healthy feedback and self-reinforcement, they can sometimes form a recurrent mechanism that drives improvements to the prediction, the underlying theory, and the empirical challenge of observing the prediction.

      And in just a few amazing events in history, this process has kept going until, sometimes, the resolution and precision of certain predictions, and their manner of checking, have converged onto some physical domain in which that new resolution shines a torch on some new potential for manipulating the flow of energy.

      And when that happens, if everything is robust, that potential then becomes yet another progression of discovery which, if successful, culminates in a prediction for a mechanical bolt-on to reality through which that manipulation can happen. And that's a technological revolution. Far from being some incidental event, when it happens it is the most profound event in the history of that whole progression that set out as a vague theory structure and an observable landscape.

      It isn't secondary, it's profound. Which is why the concept of Progress is profound to the nature of what science is. Science is much more like a living organism than any other human-created organized structure. Science is its dynamical progression, and that progression defines the heartbeat and blood of science. Science was born. It had a youth. It frequently went wayward, but always found its way home again. Science can die, just like we can, and its death is defined in just the same way. Science begins to die when that progression begins to tail off. The death of science isn't a dead body; it is simply its translation back into Philosophy.

      Already at the scientific frontier, a major dying-off is well underway. But while there is a pulse somewhere, anywhere, in the body of science, the process of death is not finalized, and hope remains. At the moment we're basically safe, because at the other end of science there is an empirical revolution taking place, such that, paradoxically given the morbidity at the frontier, that revolution has never been so strong, the new possibilities whizzing by in such a blur. But that flame will only carry on so long absent a theoretical frontier seeing progress. Empirical science cannot create theories of the calibre that we need. Theories can only come by the pairing process already mentioned, which currently can only take place in the mind's eye of a human being within a rational process.