
Re: [GP] Genetic Programming vs Algorithms?

  • Sean Luke
    Message 1 of 30, Dec 2, 2003
      On Tuesday, December 2, 2003, at 01:38 PM, Mike Eggleston wrote:

      > On Tue, 02 Dec 2003, Sean Luke wrote:
      >
      >> Although I personally think that the following definition is better:
      >>
      >> Genetic programming is a community of researchers interested in
      >> applying evolutionary computation to the discovery of solutions in the
      >> form of computational devices.
      >
      > Does the more robust definition also hold true for
      > the automatic design of electronic circuits (or ants, etc)?

      Sure, I think so.

      Note that the definition also probably applies to learning classifier
      systems, and to CAs evolved by GA etc.

      Una-May's term "executable structures" might also work instead of
      "computational devices".

      Sean
    • Candida Ferreira
      Message 2 of 30, Dec 3, 2003
        C. Setzkorn wrote:


        > I don't get you here, Candida. Let's say we evolve a polynomial using GP
        > (e.g. for regression) then we clearly have to optimise the parameters of
        > the polynomial. The good thing about GP is that it can easily drop terms
        > and does not have to evolve the corresponding parameters to zero.
        > Hence the search space can be restricted through structural
        > optimisation. One can also mutate terminals (parameters) by adding some
        > random noise as in (real valued) GAs.
        >
        > Candida Ferreira wrote:
        >
        > >John Koza wrote:
        > >
        > >
        > >
        > >>The posting below asserts
        > >>"GAs are good at parameter optimization, a task GP cannot handle, ..."
        > >>
        > >>This is not true. There are hundreds of examples (both in my books and
        > >>elsewhere) from dozens of different areas where parameter optimization takes
        > >>place inside a GP run. To take just a few examples, there is parameter
        > >>optimization that takes place (simultaneously with topology creation) in
        > >>problems involving circuits, controllers, antennas, chemical networks,
        > >>etc. --- to say nothing of parameter optimization that takes place in just
        > >>plain old mathematical expressions in symbolic regression runs.
        > >>
        > >>
        > >
        > >You are confusing plain parameter optimization with function optimization. The parameter optimization alluded to in my previous email is the one involving minimum or maximum seeking, a task handled particularly successfully by GAs and now, I might add, by GEP.
        > >

        Evolving polynomials for symbolic regression is a form of function optimization, not plain parameter optimization (also a form of function optimization). In parameter optimization you evolve the coefficients of the parameters of the function being optimized so that the output of the function is the smallest possible.

        Candida Ferreira

        -----------------------------------------------------------
        Candida Ferreira, Ph.D.
        Chief Scientist, Gepsoft
        73 Elmtree Drive
        Bristol BS13 8NA, UK
        ph: +44 (0) 117 330 9272
        http://www.gepsoft.com/gepsoft
        http://www.gene-expression-programming.com/author.asp
        -----------------------------------------------------------
      • voyantix
        Message 3 of 30, Dec 3, 2003
          Greetings,

          It is not difficult to demonstrate that the
          statement:

          "GAs are good at parameter optimization, a task GP cannot handle, ..."

          is false.

          If using any general GP system we could emulate GA then
          that would be sufficient to prove the falsehood of the claim.

          For instance, for binary-string GA:

          Choose the function set: FONE, FZERO, each with one argument.
          The functions return the eval of their argument concatenated
          with their own (1 and 0 respectively).

          Choose the terminal set: TONE, TZERO, which return 1 or 0 respectively.

          Build/create trees with the FULL method, up to depth N (equivalent
          to the length of the GA strings). The fitness eval function should
          interpret each tree in exactly the same way as a traditional GA would
          process the strings.

          With the standard GP genetic operators, this scheme emulates a
          binary-string GA. This scheme can easily be generalized to any
          kind of GA representation, i.e., non-binary.
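          The scheme above can be sketched in a few lines. This is a hypothetical illustration: the names FONE/FZERO/TONE/TZERO come from the description, while the evaluation details and concatenation order are my assumptions, and the genetic operators themselves are left out.

```python
import random

def t_one():  return "1"              # terminal TONE
def t_zero(): return "0"              # terminal TZERO
def f_one(arg):  return arg() + "1"   # FONE: eval of its argument, then its own 1
def f_zero(arg): return arg() + "0"   # FZERO: eval of its argument, then its own 0

def full_tree(depth):
    # FULL method: every path has exactly `depth` nodes; since all the
    # functions here are unary, the tree is a chain ending in one terminal.
    if depth == 1:
        return random.choice([t_one, t_zero])
    f, child = random.choice([f_one, f_zero]), full_tree(depth - 1)
    return lambda: f(child)

# Evaluating any depth-N individual yields a fixed-length bit string,
# which a GA-style fitness function can then score exactly as it would
# score a binary GA genome.
N = 8
bits = full_tree(N)()
assert len(bits) == N and set(bits) <= {"0", "1"}
```

          The sketch covers only representation and evaluation; whether the standard GP operators then behave like GA crossover and mutation is exactly the point debated later in the thread.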

          Regards,

          David Anisman


          --- In genetic_programming@yahoogroups.com, "Candida Ferreira"
          <candidaf@g...> wrote:
          >
          > John Koza wrote:
          >
          > > The posting below asserts
          > > "GAs are good at parameter optimization, a task GP cannot handle,
          ..."
          > >
          > > This is not true. There are hundreds of examples (both in my
          books and
          > > elsewhere) from dozens of different areas where parameter
          optimization takes
          > > place inside a GP run. To take just a few examples, there is
          parameter
          > > optimization that takes place (simultaneously with topology
          creation) in
          > > problems involving circuits, controllers, antennas, chemical
          networks,
          > > etc. --- to say nothing of parameter optimization that takes place
          > > in just plain old mathematical expressions in symbolic regression runs.
          >
          > You are confusing plain parameter optimization with function
          optimization. The parameter optimization alluded to in my previous
          email is the one involving minimum or maximum seeking, a task handled
          particularly successfully by GAs and now, I might add, by GEP.
          >
          > Cheers,
          > Candida Ferreira
          >
          > -----------------------------------------------------------
          > Candida Ferreira, Ph.D.
          > Chief Scientist, Gepsoft
          > 73 Elmtree Drive
          > Bristol BS13 8NA, UK
          > ph: +44 (0) 117 330 9272
          > http://www.gepsoft.com/gepsoft
          > http://www.gene-expression-programming.com/author.asp
          > -----------------------------------------------------------
        • C. Setzkorn
          Message 4 of 30, Dec 3, 2003
            >
            >
            >Evolving polynomials for symbolic regression is a form of function optimization, not plain parameter optimization (also a form of function optimization). In parameter optimization you evolve the coefficients of the parameters of the function being optimized so that the output of the function is the smallest possible.
            >
            In regression you optimize the parameters of your model (e.g. the
            parameters of the polynomial if you have chosen a polynomial as your model)

            Good fit of the model can be measured by the MSE or the absolute error.

            I don't see a difference ... sorry.

            Chris

            >
            >Candida Ferreira
            >
            >-----------------------------------------------------------
            >Candida Ferreira, Ph.D.
            >Chief Scientist, Gepsoft
            >73 Elmtree Drive
            >Bristol BS13 8NA, UK
            >ph: +44 (0) 117 330 9272
            >http://www.gepsoft.com/gepsoft
            >http://www.gene-expression-programming.com/author.asp
            >-----------------------------------------------------------
            >
            >
            >
            >
            >To unsubscribe from this group, send an email to:
            >genetic_programming-unsubscribe@yahoogroups.com
            >
            >
            >
            >Your use of Yahoo! Groups is subject to http://docs.yahoo.com/info/terms/
            >
            >
            >
            >
          • Candida Ferreira
            Message 5 of 30, Dec 3, 2003
              I'm sure that with a little ingenuity you will be able to emulate even Microsoft Excel in GP.

              Candida Ferreira

              -----------------------------------------------------------
              Candida Ferreira, Ph.D.
              Chief Scientist, Gepsoft
              73 Elmtree Drive
              Bristol BS13 8NA, UK
              ph: +44 (0) 117 330 9272
              http://www.gepsoft.com/gepsoft
              http://www.gene-expression-programming.com/author.asp
              -----------------------------------------------------------

              ----- Original Message -----
              From: "voyantix" <voyantix@...>
              To: <genetic_programming@yahoogroups.com>
              Sent: Wednesday, December 03, 2003 9:29 PM
              Subject: Re: [GP] Genetic Programming vs Algorithms?


              > Greetings,
              >
              > It is not difficult to demonstrate that the
              > statement:
              >
              > "GAs are good at parameter optimization, a task GP cannot handle, ..."
              >
              > is false.
              >
              > If using any general GP system we could emulate GA then
              > that would be sufficient to prove the falsehood of the claim.
              >
              > For instance, for binary-string GA:
              >
              > Choose the function set: FONE, FZERO, each with one argument.
              > The functions return the eval of their argument concatenated
              > with their own (1 and 0 respectively).
              >
              > Choose the terminal set: TONE, TZERO, which return 1 or 0 respectively.
              >
              > Build/create trees with the FULL method, up to depth N (equivalent
              > to the length of the GA strings). The fitness eval function should
              > interpret each tree in exactly the same way as a traditional GA would
              > process the strings.
              >
              > With the standard GP genetic operators, this scheme emulates a
              > binary-string GA. This scheme can easily be generalized to any
              > kind of GA representation, i.e., non-binary.
              >
              > Regards,
              >
              > David Anisman
              >
              >
              > --- In genetic_programming@yahoogroups.com, "Candida Ferreira"
              > <candidaf@g...> wrote:
              > >
              > > John Koza wrote:
              > >
              > > > The posting below asserts
              > > > "GAs are good at parameter optimization, a task GP cannot handle,
              > ..."
              > > >
              > > > This is not true. There are hundreds of examples (both in my
              > books and
              > > > elsewhere) from dozens of different areas where parameter
              > optimization takes
              > > > place inside a GP run. To take just a few examples, there is
              > parameter
              > > > optimization that takes place (simultaneously with topology
              > creation) in
              > > > problems involving circuits, controllers, antennas, chemical
              > networks,
              > > > etc. --- to say nothing of parameter optimization that takes place
              > > > in just plain old mathematical expressions in symbolic regression runs.
              > >
              > > You are confusing plain parameter optimization with function
              > optimization. The parameter optimization alluded to in my previous
              > email is the one involving minimum or maximum seeking, a task handled
              > particularly successfully by GAs and now, I might add, by GEP.
              > >
              > > Cheers,
              > > Candida Ferreira
              > >
              > > -----------------------------------------------------------
              > > Candida Ferreira, Ph.D.
              > > Chief Scientist, Gepsoft
              > > 73 Elmtree Drive
              > > Bristol BS13 8NA, UK
              > > ph: +44 (0) 117 330 9272
              > > http://www.gepsoft.com/gepsoft
              > > http://www.gene-expression-programming.com/author.asp
              > > -----------------------------------------------------------
            • Mariusz Nowostawski
              Message 6 of 30, Dec 3, 2003
                Hi David,


                Your "proof" has a "mistake" and is invalid.


                voyantix wrote:
                > It is not difficult to demonstrate that the statement:
                >
                > "GAs are good at parameter optimization, a task GP cannot handle, ..."
                >
                > is false.
                >
                > If using any general GP system we could emulate GA then
                > that would be sufficient to prove the falsehood of the claim.

                This is correct. However, to "emulate" GA in GP in the context of this
                discussion is a bit 'tricky'...

                Before we start, we need to agree on one simple assumption (an axiom, so
                to speak) to make the whole discussion worthwhile, and we have to agree
                on one terminological aspect of the TREES vs. STRINGS distinction:


                Assumption 1:

                GA == representation of the genome is a STRING. This STRING after
                'evaluation' represents a point in a (multidimensional) fitness
                landscape space.

                GP == representation of the genome is a TREE. This TREE after
                'evaluation' represents a point in a (multidimensional) programs space.
                The particular program's output represents a point in a
                (multidimensional) fitness landscape space.


                Assumption 2:

                STRING != TREE

                (In other words, if something is a STRING, it is not a TREE. If
                something is a TREE, it is by definition not a STRING. Therefore, trees
                consisting of only one branch are (according to Assumption 2) strings,
                and are not trees.)


                If we assume 1 without assuming 2 we cannot continue. It would be like
                postulating that GP *IS* a special case of GA (which of course is true
                in some sense, but this case is least interesting here, and we would
                have nothing to talk about).

                It seems like the whole point of the GP vs. GA discussion is more or less
                this: "are tree-based representations 'different' from string-based
                representations, and if so, for what kind of problems is one superior to
                the other?" If the TREE-based representation *IS* superior on some
                problems to pure STRING-based representations (or the other way round), from
                the NFL theorem we *know* it must be inferior on some other problems (or
                the other way round). If they both are just isomorphic to one another, then
                there is no difference between doing GA and doing GP.


                One can map a STRING to a tree, as a STRING is actually a special kind of
                tree, and one can map a TREE to a STRING, e.g. by using reverse Polish
                notation. But mapping one representation into the other does not
                automatically mean the evolutionary dynamics is isomorphic in both
                cases. This is the whole point - the dynamics of the evolutionary
                search process - not the mappings of representations.
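                As a sketch of that mapping (my illustration, with hypothetical helper names): the round trip between tree and reverse-Polish string is lossless, which by itself says nothing about whether the search dynamics over the two representations match.

```python
def to_rpn(node):
    """Post-order flattening of a tree into reverse Polish notation:
    children first, then the operator."""
    if isinstance(node, tuple):          # internal node: (op, left, right)
        op, left, right = node
        return to_rpn(left) + to_rpn(right) + [op]
    return [node]                        # leaf (terminal)

def from_rpn(tokens, arities={"+": 2, "*": 2}):
    """Rebuild the tree with a stack: the inverse of to_rpn."""
    stack = []
    for tok in tokens:
        if tok in arities:
            right, left = stack.pop(), stack.pop()
            stack.append((tok, left, right))
        else:
            stack.append(tok)
    return stack.pop()

tree = ("+", "x", ("*", "x", "y"))
assert to_rpn(tree) == ["x", "x", "y", "*", "+"]
assert from_rpn(to_rpn(tree)) == tree    # lossless round trip
```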

                Some claim the evolutionary search is isomorphic, some claim it is not.

                Your proof unfortunately does not answer the question one way or the other.

                > For instance, for binary-string GA:
                >
                > Choose the function set: FONE, FZERO, each with one argument.
                > The functions return the eval of their argument concatenated
                > with their own (1 and 0 respectively).
                >
                > Choose the terminal set: TONE, TZERO, which return 1 or 0 respectively.

                So far so good, but... you have actually defined a representation
                method which produces STRINGS; therefore, it actually *IS* GA, not GP
                (see assumptions above).

                > Build/create trees with the FULL method, up to depth N (equivalent

                You are actually building STRINGS, not TREES (see assumptions above).

                > to the length of the GA strings). The fitness eval function should
                > interpret each tree in exactly the same way as a traditional GA would
                > process the strings.

                This actually *IS* GA, not GP (see assumptions above).

                > With the standard GP genetic operators, this scheme emulates
                > binary string GA. This scheme can be easily generalized to any

                This scheme, based on a STRING representation, does not emulate GA in GP;
                it actually *is* GA.

                > kind of GA representation, i.e., non-binary.

                > Regards,
                >
                > David Anisman


                --
                best regards
                Mariusz Nowostawski
              • Korneel Duyvesteyn
                Message 7 of 30, Dec 3, 2003
                  Hi,

                  "Candida Ferreira" <candidaf@...> writes:

                  > I'm sure that with a little ingenuity you will be able to emulate even
                  > Microsoft Excel in GP.

                  Good luck trying to define a fitness-function that will select offspring
                  for "Excel-like" behavior !!!

                  What Candida means is that many things can be emulated, even though the tool was simply not designed for them.
                  If you solve a problem with GP using a function set containing only 2-argument functions, the Full method,
                  and no mutation, you can also use a GA to solve this problem in a very sophisticated way (by converting the binary tree into bits),
                  but the GA is just not the best approach to it.

                  And what Candida means (I think) by the difference between parameter and function optimization is, for example: when GP
                  is used to generate the equation y = 0.75x + 0.6x^2,
                  GP will probably easily find y = ax + bx^2, but will probably not find values for 'a' and 'b' as optimal as a GA would. Or am I mistaken?
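                  That split can be sketched concretely. Assuming GP has already found the shape y = a*x + b*x^2, a simple real-valued evolutionary loop (a (1+1)-style strategy here, standing in for a full GA; all details are my illustration) tunes the two coefficients against samples of the target 0.75x + 0.6x^2:

```python
import random

# Training data sampled from the target function y = 0.75x + 0.6x^2.
xs = [x / 10 for x in range(-20, 21)]
target = [0.75 * x + 0.6 * x**2 for x in xs]

def mse(a, b):
    """Mean squared error of the candidate a*x + b*x**2 on the samples."""
    return sum((a * x + b * x**2 - t) ** 2 for x, t in zip(xs, target)) / len(xs)

random.seed(0)
a, b = random.uniform(-2, 2), random.uniform(-2, 2)
best = mse(a, b)
for _ in range(20000):
    # Gaussian mutation of the two parameters; keep the child if it is no worse.
    a2, b2 = a + random.gauss(0, 0.05), b + random.gauss(0, 0.05)
    m = mse(a2, b2)
    if m <= best:
        a, b, best = a2, b2, m

assert best < 0.01   # the coefficients end up close to a = 0.75, b = 0.6
```

                  The structure (which terms appear) is fixed here and only the real-valued parameters evolve, which is exactly the parameter-optimization task being discussed.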

                  Regards, Korneel.





                  [Yahoo! Groups Sponsor advertisement: 2ef720d.jpg]
                • Gordon D. Pusch
                  Message 8 of 30, Dec 3, 2003
                    "Candida Ferreira" <candidaf@...> writes:

                    > I'm sure that with a little ingenuity you will be able to emulate even
                    > Microsoft Excel in GP.

                    Good luck trying to define a fitness-function that will select offspring
                    for "Excel-like" behavior !!!


                    -- Gordon D. Pusch

                    perl -e '$_ = "gdpusch\@...\n"; s/NO\.//; s/SPAM\.//; print;'
                  • Russ Abbott
                    Message 9 of 30, Dec 3, 2003
                      Although the discussion has drifted a bit from GP vs. algorithms, here is a problem that might illustrate the difference.
                       
                      For an AI class, I made up a two-player game. It is played on a network. Each side starts with N pieces (for some N) on the network nodes. The players can move from one node to another by following the edges of the network. (To make it more interesting, the players have separate graphs, although the sets of nodes in the two graphs overlap.)

                      If a player lands on a node that contains pieces owned by the other player, the other player's pieces are removed from the game. A player wins if he removes all of the other player's pieces or if the other player has no move because all his pieces are at dead-end nodes.
                       
                      So this is something like checkers played on a pair of graphs with the following differences.
                      • Pieces are taken when you land on their square instead of jumping them.
                      • It is allowed to have multiple pieces on a square.
                      • There are no special pieces (like kings).
                      Since this was an AI class, the point was to illustrate minimax, which it does quite well.  It's amazing how difficult it is for a human to look ahead, especially on an unfamiliar board.  (The "board" consists of a pair of graphs, one for each player.)  But of course, a minimax player can look ahead very easily.
                       
                      Minimax is an algorithm. Although not trivial, it is not a very complex algorithm.  How would one attack this problem with GP?
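                      For comparison, the minimax side really is compact. A minimal sketch (my illustration, not the Prolog player; leaves are numeric scores, and a small hand-built dictionary stands in for the game's real state space):

```python
def minimax(tree, node, maximizing=True):
    """Return the minimax value of `node` in an explicit game tree.
    Internal nodes map to lists of children; leaves are numeric scores."""
    children = tree.get(node)
    if not children:                 # leaf: static evaluation
        return node
    values = [minimax(tree, c, not maximizing) for c in children]
    return max(values) if maximizing else min(values)

# Root has two moves; the opponent then minimizes among the leaf scores.
game_tree = {"root": ["a", "b"], "a": [3, 5], "b": [2, 9]}
assert minimax(game_tree, "root") == 3   # max(min(3, 5), min(2, 9))
```

                      The hard part for the network game would be generating `tree` on the fly from the two graphs, not the algorithm itself.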
                       
                      As a first approximation, one could fix the board and use GP to evolve a player for  that specific game.  
                       
                      I imagine it would be a lot more challenging to use GP to evolve a player for a game in which the board itself was part of the input.  Yet a graph is a data structure, and one could give the GP system a set of operators that let it move around on its input graphs. 
                       
                      Assuming that I'm right and that GP will not be successful at evolving a general player for this game, this suggests that GP is capable of evolving relatively simple functions but is not very good at evolving algorithms. For a further discussion of this point see: http://abbott.calstatela.edu/oogp/GuidedGP.doc.
                       
                       
                      For slightly more information about this game and the assignments associated with it, look at http://abbott.calstatela.edu/courses/cs_460. These are assignments 9, 10, and 11. A minimax Prolog player for the game is available at http://abbott.calstatela.edu/Courses/CS_460/Network%20Game/.
                       
                      To run the game, load prolog and then:
                       
                          ?-  testGame(<Player1>, [e, b], <Player2>, [c, b]).
                       
                      where <Player1> and <Player2> are one of human, random, or minimax.
                      There is a nice (free) prolog at: http://www.swi-prolog.org/.
                       
                      -- Russ Abbott
                       
                       
                    • Gordon D. Pusch
                      Message 10 of 30, Dec 3, 2003
                        Korneel Duyvesteyn <189770cd@...> writes:

                        >> "Candida Ferreira" <candidaf@...> writes:
                        >>
                        >>> I'm sure that with a little ingenuity you will be able to emulate even
                        >>> Microsoft Excel in GP.
                        >>
                        >> Good luck trying to define a fitness-function that will select offspring
                        >> for "Excel-like" behavior !!!
                        >
                        > What Candida means is that many things can be emulated but are simply not
                        > designed for it.

                        Korneel: I think your sarcasm detector needs a tune-up. My remark was not
                        meant to be taken particularly seriously...

                        BTW, was anyone else in the group annoyed by that 44k base-64 encoded JPEG
                        advertisement that Yahoo appended to Korneel's followup ??? Yahoo's ads
                        were annoying enough when they were short bits of HTML, but when they start
                        shoving binary images down our throats, I wonder if it's once again time
                        for the group to look for Yet Another Home --- before Yahoo decides to allow
                        JavaScript ads that could potentially contain viruses or "spyware"... >:-(


                        -- Gordon D. Pusch

                        perl -e '$_ = "gdpusch\@...\n"; s/NO\.//; s/SPAM\.//; print;'
                      • Martin Sewell
                        Message 11 of 30, Dec 3, 2003
                          At 21:26 02/12/2003 +0000, Candida Ferreira wrote:

                          >Korneel Duyvesteyn wrote:
                          >
                          > > Does anyone know a problem which GA can solve but GP cannot? (in a
                          > > reasonable way..)
                          >
                          >GAs are good at parameter optimization, a task GP cannot handle, although
                          >GEP, more related to GP than GAs, can handle this kind of problem very
                          >well thanks to its multigenic nature (see my book <SNIP>).

                          GP is a subset of, and more expressive than, GAs, though GAs are likely to
                          be more efficient for some classes of problems. GEP, in contrast, is
                          overtly more commercial.

                          Regards

                          Martin

                          Disclaimer: <NFL>For any algorithm, any elevated performance over one class
                          of problems is exactly paid for in performance over another class.</NFL>
                        • Mariusz Nowostawski
                          Message 12 of 30, Dec 3, 2003
                            Hi Martin,

                            Martin Sewell wrote:
                            [...]
                            > GP is a subset of, and more expressive than, GAs, though GAs are likely to
                            > be more efficient for some classes of problems.

                            I think I can understand what you are trying to communicate, but the sentence
                            above is unfortunately not consistent.

                            First, if "GP is a subset of GAs", then GP by definition cannot be "more
                            expressive than GAs" -- because GAs, by definition would be a superset
                            containing GP+other stuff, therefore GA could express all GP can express
                            plus more, which GP cannot express.

                            Second, if "GP is a subset of GAs" then it is not only "likely" but
                            actually a simple fact (by definition) that GAs are "more efficient
                            for some classes of problems" -- because everything GP can do, GAs can do
                            as well, but not everything GAs can do can be repeated by GP with the same
                            efficiency.

                            As I said before, any talk on the subject of "GP vs. GA" must be preceded
                            by agreement on some basic understanding of the words being used in
                            the discourse - just shuffling the words and using them in different
                            contexts with different semantics will lead nowhere.

                            >
                            > Regards
                            > Martin


                            --
                            best regards
                            Mariusz
                          • Martin Sewell
                            Message 13 of 30, Dec 3, 2003
                              At 15:36 04/12/2003 +1300, Mariusz Nowostawski wrote:
                              >Hi Martin,
                              >
                              >Martin Sewell wrote:
                              >[...]
                              > > GP is a subset of, and more expressive than, GAs, though GAs are likely to
                              > > be more efficient for some classes of problems.
                              >
                              >I think I can understand what you are trying to communicate, but the sentence
                              >above is unfortunately not consistent.

                              The above sentence is false. :-)

                              >First, if "GP is a subset of GAs", then GP by definition cannot be "more
                              >expressive than GAs" -- because GAs, by definition would be a superset
                              >containing GP+other stuff, therefore GA could express all GP can express
                              >plus more, which GP cannot express.

                              "More expressive", not "expresses more".

                              >Second, if "GP is a subset of GAs", then it is not merely "likely" but a
                              >simple fact (by definition) that GAs are "more efficient for some
                              >classes of problems" -- because everything GP can do, GAs can do as
                              >well, yet not everything GAs can do can be matched by GP with the same
                              >efficiency.

                              The qualifier "likely" is used because there exists a GA which would have
                              an equivalent GP, for which the statement would be false. I also included
                              a disclaimer: "<NFL>For any algorithm, any elevated performance over one
                              class of problems is exactly paid for in performance over another class.</NFL>"

                              >As I said before, any discussion of "GP vs. GA" must be preceded by
                              >agreement on some basic understanding of the words being used in the
                              >discourse -- merely shuffling the words and using them in different
                              >contexts with different semantics will lead nowhere.

                              That's why I included "GP is a subset of [...] GAs".

                              Regards

                              Martin
                            • C. Setzkorn
                              Message 14 of 30 , Dec 4, 2003
                                > And what Candida means (I think) by the difference between parameter and
                                > function optimization is for example, when GP
                                > is used to generate the equation y = 0.75x + 0.6x^2
                                > GP will probably easily find y = ax + bx^2 but will probably not find
                                > values for 'a' and 'b' as optimal as GA would, or am I mistaken?

                                Would be interesting to do a comparison. Is anyone aware of one?

                                On the other hand, one could always add some noise to the terminal
                                parameters, similar to mutation in real-valued GAs. Perhaps one could
                                even represent terminals as bit strings and perform GA crossover
                                between the terminals that the two trees undergoing GP crossover have
                                in common.

                                I guess one could also generate suitable model structures using GP and
                                then optimise their parameters using a GA.

                                I suppose someone must have done something like this. I would be
                                grateful for any references. Many thanks in advance.

                                Regards,
                                Christian
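The hybrid idea raised above — GP proposing a model structure, with GA-style real-valued mutation tuning its constants — can be illustrated with a minimal sketch. Everything here is invented for illustration: the structure y = a·x + b·x² is assumed to have already been found by GP, and `tune_constants` is a hypothetical (1+1)-style loop that perturbs the constants with Gaussian noise, as with real-valued GA mutation.

```python
import random

# Target coefficients of y = 0.75x + 0.6x^2 (the example from the thread).
TARGET_A, TARGET_B = 0.75, 0.6
xs = [i / 10.0 for i in range(-20, 21)]

def error(a, b):
    # Sum of squared errors against samples of the target polynomial.
    return sum((a * x + b * x**2 - (TARGET_A * x + TARGET_B * x**2)) ** 2
               for x in xs)

def tune_constants(steps=5000, sigma=0.1, seed=0):
    # (1+1)-style loop: perturb the constants with Gaussian noise and
    # keep the child whenever it is no worse than the parent.
    rng = random.Random(seed)
    a, b = rng.uniform(-1, 1), rng.uniform(-1, 1)
    best = error(a, b)
    for _ in range(steps):
        ca, cb = a + rng.gauss(0, sigma), b + rng.gauss(0, sigma)
        e = error(ca, cb)
        if e <= best:
            a, b, best = ca, cb, e
    return a, b, best

a, b, e = tune_constants()
print(round(a, 2), round(b, 2))  # close to 0.75 and 0.6
```

The structural search (finding ax + bx² in the first place) is GP's job; this loop only shows how cheap the constant-tuning half of the hybrid can be once the structure is fixed.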
                              • Mats Andrén
                                Message 15 of 30 , Dec 4, 2003
                                  So...

                                  This seems to be a hot topic, with little agreement indeed. I can't
                                  help but think of Langdon and Poli's book "Foundations of Genetic
                                  Programming", where they state, complete with a lot of mathematical
                                  theory about the search space, that GA is actually a subset of a
                                  certain kind of GP (with one-point crossover etc.). In this version of
                                  GP, the later stages of a GP run actually behave like a GA search,
                                  according to the authors and their mathematical proofs.

                                  I think the theories of this book would have quite a lot to say about
                                  what kind of efficiency we can expect GP to have on the problem
                                  mentioned here. On the other hand, this is not the commonly used
                                  version of GP (if there is such a version), and on a third hand (huh),
                                  I'm not good enough at mathematics to really assess the general
                                  validity of the mathematical proofs in this book. Nor have I had
                                  enough time... Maybe they are all wrong? :)

                                  In any case, it might be a tip to some of you interested in the relation
                                  between GA and GP, since this book explores it quite a lot.

                                  And, correct me if I'm wrong (since I'm not too up to date on current
                                  GA implementations), but doesn't a GA produce a fixed-length bit
                                  string, whereas GP produces a variable-length program, which may in
                                  turn produce a fixed-length bit string (if that's what you want it to
                                  do)? Even though GA and GP can both be used for min/max problem
                                  solving, they aren't really "the same" (depending on how you define
                                  "same", of course), since you would have to execute a program to get
                                  the results out of a GP individual.

                                  Have a nice day!

                                  /Mats Andrén
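The representational contrast described above — a GA genome as a fixed-length bit string that decodes directly, versus a GP genome as a variable-size program that must be executed — can be sketched as follows. All names and the tiny tree interpreter are invented for illustration.

```python
# GA-style individual: a fixed-length bit string (always 16 bits here)
# that decodes directly to a number in [0, 1).
ga_genome = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
ga_value = sum(bit << i for i, bit in enumerate(reversed(ga_genome))) / 2**16

# GP-style individual: a variable-size tree, here the nested tuple
# (+ x (* x x)), i.e. x + x*x, which yields a value only once interpreted.
gp_genome = ("+", "x", ("*", "x", "x"))

def run(tree, x):
    # Tiny interpreter: a GP individual must be *executed* to get output.
    if tree == "x":
        return x
    op, left, right = tree
    a, b = run(left, x), run(right, x)
    return a + b if op == "+" else a * b

print(ga_value)             # direct decoding; the genome length never varies
print(run(gp_genome, 3.0))  # 3 + 9 = 12.0
```

Crossover on `ga_genome` always yields another 16-bit string, while crossover on `gp_genome` can change the tree's size and shape — which is exactly the variable-length point made in the message above.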

                                • Candida Ferreira
                                  Message 16 of 30 , Dec 4, 2003
                                    Martin Sewell wrote:

                                    > > > Does anyone know a problem which GA can solve but GP cannot? (in a
                                    > > > reasonable way..)
                                    > >
                                    > >GAs are good at parameter optimization, a task GP cannot handle, although
                                    > >GEP, more related to GP than GAs, can handle this kind of problem very
                                    > >well thanks to its multigenic nature (see my book <SNIP>).
                                    >
                                    > GP is a subset of, and more expressive than, GAs, though GAs are likely to
                                    > be more efficient for some classes of problems. GEP, in contrast, is
                                    > overtly more commercial.

                                    I wonder what you mean by this (the commercial part, that is)...

                                    Candida Ferreira

                                    -----------------------------------------------------------
                                    Candida Ferreira, Ph.D.
                                    Chief Scientist, Gepsoft
                                    73 Elmtree Drive
                                    Bristol BS13 8NA, UK
                                    ph: +44 (0) 117 330 9272
                                    http://www.gepsoft.com/gepsoft
                                    http://www.gene-expression-programming.com/author.asp
                                    -----------------------------------------------------------