
5194 Re: quasi-random numbers

  • Bob McKay
    May 4, 2010
      Hi David, sorry, I was previously commenting more on the general discussion than on your original question. I agree that you need to tailor the sequence dimensionality to the problem.

      I haven't worked on using LD sequences after initialisation, but I imagine it would again be appropriate to consider our overall goal. At least in comma algorithms, where we assume all information is held in the current population, I think we would probably want to restart sampling each generation. But maybe it's a bit more complex than that. For example, if we wanted a Gaussian bias for mutation in a GA, we should presumably use something like a Gaussian transformation of an LD sequence, and similarly for crossover. However, I'm less clear on how this might appropriately work in GP. I'm not too sure it's easy to define what it is we want GP to do "even sampling" over.
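      A minimal sketch of the Gaussian transformation idea, under my own assumptions (the thread names no specific sequence or step size): a Halton/van der Corput low-discrepancy sequence pushed through the inverse normal CDF, so the offsets are Gaussian-distributed but still evenly spread.

```python
# Sketch only: the choice of Halton bases and sigma is illustrative,
# not anything specified in the discussion above.
from statistics import NormalDist

def halton(index, base):
    """Element `index` (1-based) of the van der Corput sequence in `base`."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def gaussian_ld_offsets(n, bases=(2, 3), sigma=0.1):
    """n mutation offsets: Halton points in [0,1)^d, each coordinate
    mapped through the inverse normal CDF to N(0, sigma^2)."""
    inv = NormalDist().inv_cdf
    return [[sigma * inv(halton(i, b)) for b in bases]
            for i in range(1, n + 1)]

offsets = gaussian_ld_offsets(5)
```

      The inverse-CDF mapping is monotone per coordinate, which is why it preserves the even coverage of the underlying sequence better than, say, Box-Muller would.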

             Best Wishes


      My question that started this discussion was in fact driven by concerns with initializing an EA. I assume that if the genome has 40 real numbers, I would use a QR sequence that was good in 40 dimensions.
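      The initialization David describes might be sketched as follows, assuming a Halton sequence with one prime base per gene (an assumption on my part; at 40 dimensions a scrambled Sobol sequence would usually be preferred, since plain high-dimensional Halton coordinates correlate badly).

```python
# Illustrative sketch: 40-dimensional Halton points as initial genomes.
# Population size and the Halton construction are my assumptions.
def primes(n):
    """First n primes, used as Halton bases."""
    ps, c = [], 2
    while len(ps) < n:
        if all(c % p for p in ps):
            ps.append(c)
        c += 1
    return ps

def halton_point(index, bases):
    """One point of the Halton sequence in [0,1)^len(bases), 1-based index."""
    point = []
    for base in bases:
        result, f, i = 0.0, 1.0, index
        while i > 0:
            f /= base
            result += f * (i % base)
            i //= base
        point.append(result)
    return point

GENOME_DIM = 40
POP_SIZE = 10
bases = primes(GENOME_DIM)
population = [halton_point(i, bases) for i in range(1, POP_SIZE + 1)]
```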

      I'm not clear on using a QR sequence later as part of a mutation operator. A mutation operator that looks at each real and decides whether to mutate it: does it use the same 40-dimensional sequence as the initialization, or a one-dimensional sequence? Do I need to work backward from the expected number of function evaluations to the expected number of calls to the mutation operator, and then generate a sequence with that many items? I'm not sure I can just substitute a call to a QR function for a call to a PRNG.
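      The two alternatives in that question can at least be made concrete (illustrative only; which one is appropriate is exactly the open question, and the tiny genome size and index bookkeeping here are my own assumptions):

```python
# (a) one multi-dimensional point per mutation call, gene k reading
#     coordinate k; (b) one shared 1-D sequence consumed gene by gene.
def van_der_corput(index, base=2):
    """Element `index` (1-based) of the 1-D van der Corput sequence."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

GENOME = 4   # small stand-in for the 40 reals in the post
CALLS = 3    # number of mutation-operator invocations

# (a) one 4-dimensional Halton point per call (one prime base per gene)
per_call = [[van_der_corput(t, base) for base in (2, 3, 5, 7)]
            for t in range(1, CALLS + 1)]

# (b) gene k of call t consumes the next element of a single 1-D sequence
per_gene = [[van_der_corput(t * GENOME + k + 1) for k in range(GENOME)]
            for t in range(CALLS)]
```

      Option (a) keeps each call well-spread across the genome's dimensions; option (b) treats all mutation draws as one long stream, which is closer to how a PRNG would be used but sacrifices the multi-dimensional uniformity.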

      David vun Kannon
