Re: [MAGIC-list] Re: An argument against intelligent design (the simulation argument)
Feb 7, 2014

[crossposting to ai-philosophy and analytic]

Thanks Gabriel!

I'm pleased when somebody sets aside the time to read a paper that serves only to extend our philosophical understanding, rather than to make money or gain any personal benefit. There are probably weak points in the paper, but nothing major that I can see. It boils down to a very simple reason: the bland indifference principle does not work here.

The implausibility comes from:

a) the information-theoretic incompleteness necessitated by the quantum simulation.
   a1) I argue explicitly that trying to "detect attempts to detect the simulation" and prevent them is itself infeasible, probably harder than the quantum simulation itself.
b) the underspecification of the cosmology:
   b1) What is the size of the "real" universe?
   b2) What is the age of the "real" universe?

Since these are not adequately addressed by the simulation argument, the whole argument boils down to a quite unscientific rephrasing of young-earth creationism: the claim that the earth is 6000 years old and was planted with dinosaur fossils to deceive us.

Of course, what Bostrom seems to believe is much odder, judging from the most probable scenarios he outlines in his paper.
He seems to believe that the world is a 100-year-old simulation (100 years of simulated time; in reality, much shorter!), that only the earth and the local solar neighborhood are simulated, and that, basically, the scenario of the movie The Matrix is true.

Regards,

On Mon, Jan 20, 2014 at 3:53 AM, Gabriel Leuenberger <gabriel.leuenberger@...> wrote:

I also want to say that Eray did a good job of pointing out that realistic simulations within simulations within simulations are implausible. And my hypothesis about the length of the third part of the program is not based on any evidence.

On Monday, 20 January 2014 02:42:32 UTC+1, Gabriel Leuenberger wrote:

I think it would be more rigorous to use a combination of Hutter's "observer localization" and Orseau's "space-time embedded agents" to debunk Bostrom's thesis.

Assume there is a 'real' universe which contains real humans, but also contains computers with simulated humans. Let's assume the software has a good enough AGI to run a very efficient and realistic simulation (which would give a whole new meaning to panpsychism). At present, the simulated humans have no way of telling whether they live in a simulation or not.

Now we should try to argue that the shortest description of a conscious brain state of a simulated human is longer than the shortest description of a conscious brain state of a real human. By "conscious brain state" I mean data which represents a connectome and the corresponding brain activity of a thinking brain. The shortest description of this data would be a program composed of three parts: 1) a Theory of Everything (ToE), 2) observer localisation, 3) a program which transforms the physical data into the brain-state information.

My hypothesis is that the third part of the program would be longer for sophisticatedly simulated humans. It would therefore be less probable that we are part of such a simulation.

But then there's this paradox: the simulated philosophers would also arrive at the conclusion that they are not simulated.
The solution to this paradox is that an unlimited AIXI would know whether it lives in the simulation or not, because the simulation is different from reality. But the simulated philosophers don't have enough computational resources to reach this conclusion.

On Monday, 13 May 2013 02:45:47 UTC+2, Eray Özkural wrote:
I had written, a couple of months ago, an essay on the simulation
argument. It contains a *generic* argument against any version of
intelligent design, based on induction and AI. I think I had explained
this argument to Laurent Orseau as well during AGI-12. Let me know
what you think about it!
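Gabriel's three-part description-length argument above lends itself to a small numeric sketch. The following is purely illustrative: the bit counts are invented for the example and appear nowhere in the thread. It only shows how, under a Solomonoff-style 2^-L prior, a longer part 3 (the physics-to-brain-state transform) for simulated observers translates into lower probability of being simulated.

```python
# Illustrative sketch only: the bit counts below are made up, not taken
# from the thread. Under a Solomonoff-style prior, a hypothesis whose
# shortest description is L bits long gets weight 2**-L.

def prior_weight(length_bits: int) -> float:
    """Algorithmic prior weight 2^-L for an L-bit description."""
    return 2.0 ** -length_bits

# Three-part decomposition from the thread: a Theory of Everything,
# observer localization, and a program transforming the physical data
# into brain-state information. Only part 3 differs between the cases.
real = {"toe": 300, "localization": 200, "transform": 100}
simulated = {"toe": 300, "localization": 200, "transform": 250}

w_real = prior_weight(sum(real.values()))
w_simulated = prior_weight(sum(simulated.values()))

# The odds ratio depends only on the difference in total description
# length: here 150 extra bits for the simulated case, i.e. odds of
# 2**150 against being in such a simulation.
delta = sum(simulated.values()) - sum(real.values())
odds = w_real / w_simulated
print(f"extra bits for simulation hypothesis: {delta}")
print(f"odds (real : simulated) = 2^{delta} = {odds:.3g}")
```

Note that for realistic description lengths the weights 2^-L underflow double precision, so in practice one would compare the lengths directly (i.e. work with log-odds) rather than the weights themselves.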
--
Eray Ozkural, PhD. Computer Scientist
Founder, Gok Us Sibernetik Ar&Ge Ltd.