
RE: [GP] NN and GA

  • Anna I Esparcia Alcazar
    Message 1 of 21 , Sep 10, 2003
      Dear Roger, Toni & Bill,
       
      Ken Sharman and I did some work on evolving the topology and weights of recurrent NNs using GP back in 1995 (you can find the references at http://www.iti.upv.es/cas/CASpubs.htm) but, if I understand it correctly, the question here is about evolving the _inputs_ to the neural network, not the NN itself. Perhaps Roger would like to clarify this?
       
      Anna
       

      -------------------------------------
      Dr Anna I Esparcia-Alcazar
      Institut Tecnologic de Informatica
      Universitat Politecnica de Valencia
      Cami de Vera s/n
      46071 Valencia (Spain)

      Tel. +34 963877069 Fax +34 963877239
      -------------------------------------

      -----Original Message-----
      From: Tony Abou-Assaleh [mailto:taa@...]
      Sent: Wednesday, September 10, 2003 7:32
      To: genetic_programming@yahoogroups.com
      Subject: Re: [GP] NN and GA

      Hi Roger,

      I have done some work on evolving RNNs. Some of it you can find on my
      website at: http://www.cs.dal.ca/~taa/

      Email me if you have any questions about the approach or implementation. I can
      make the source code available (AS IS) in both Java and C.

      Cheers,

      TAA

      --------------------------------------------------
      Tony Abou-Assaleh
      Ph.D. Candidate, Faculty of Computer Science
      Dalhousie University, Halifax, NS, Canada, B3H 1W5
      Fax:   902-492-1517
      Email: taa@...
      WWW:   http://www.cs.dal.ca/~taa/
      ---------------------[THE END]--------------------


      On Tue, 9 Sep 2003, Roger Smith wrote:

      > Does anyone know of or have any recommendations for
      > tutorials outlining GA and evolving inputs to Neural
      > Nets?
      > Thanks
      > Rog
      >


    • Candida Ferreira
      Message 2 of 21 , Sep 10, 2003
        There's an entire chapter on the complete induction of neural networks in my book "GEP: Mathematical Evolution by an Artificial Intelligence". The detailed Table of Contents can be found at:

        http://www.gene-expression-programming.com/gep/books/GEPBookTOC.asp

        Candida Ferreira

        -----------------------------------------------------------
        Candida Ferreira, Ph.D.
        Chief Scientist, Gepsoft
        73 Elmtree Drive
        Bristol BS13 8NA, UK
        ph: +44 (0) 117 330 9272
        www.gepsoft.com/gepsoft
        www.gene-expression-programming.com/author.asp
        -----------------------------------------------------------



        ----- Original Message -----
        From: "Roger Smith" <rog_21@...>
        To: <genetic_programming@yahoogroups.com>
        Sent: Tuesday, September 09, 2003 6:46 PM
        Subject: [GP] NN and GA


        > Does anyone know of or have any recommendations for
        > tutorials outlining GA and evolving inputs to Neural
        > Nets?
        > Thanks
        > Rog
        >
      • Roger Smith
        Message 3 of 21 , Sep 10, 2003
          I guess, Anna, what I am trying to get at is a way to
          evolve the inputs to the NN, plug them into the NN,
          run the NN, and see how well that run of the NN
          correlates with a set of real-world data. Based on
          this correlation, I would calculate the fitness of
          the set of chromosomes, then run the GA again to
          generate a new set of chromosomes and start the whole
          process over again.
          Does that make sense?


          --Rog
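The loop described above can be sketched in a few lines. The following is only a toy illustration of the structure, not anyone's actual system: the `run_nn` function is a made-up stand-in for the real network, and the "real-world" series is synthetic.

```python
import math
import random

random.seed(0)

# Synthetic stand-in for the real-world data series.
TARGET = [math.sin(t / 3.0) for t in range(20)]

def run_nn(inputs):
    # Made-up stand-in for the real NN: a fixed nonlinear map from the
    # evolved input vector to an output series.
    a, b, c = inputs
    return [math.tanh(a * math.sin(b * t / 3.0) + c) for t in range(20)]

def correlation(xs, ys):
    # Pearson correlation between two series (0.0 if either is constant).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def fitness(chromosome):
    # How well this run of the NN correlates with the real-world data.
    return correlation(run_nn(chromosome), TARGET)

def evolve(pop_size=30, generations=40):
    pop = [[random.uniform(-2, 2) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            cut = random.randrange(1, 3)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            if random.random() < 0.3:             # Gaussian mutation
                i = random.randrange(3)
                child[i] += random.gauss(0, 0.2)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Each generation, fitness is simply the correlation between the NN's run and the target series, exactly as described above.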

          -- Anna I Esparcia Alcazar <anna@...> wrote:
          > Ken Sharman and I did some work on evolving the
          > topology and weights of recurrent NNs using GP back
          > in 1995, but, if I understand it correctly, the
          > question is about evolving the _inputs_ to the
          > neural network, not the NN itself. Perhaps Roger
          > would like to clarify this?


        • C. Setzkorn
          Message 4 of 21 , Sep 10, 2003
            Hi Roger,

            Do you want to deploy GP for feature selection? If so, some work has
            already been done in that area.

            @article{ yang97feature,
            author = "Jihoon Yang and Vasant Honavar",
            title = "Feature Subset Selection Using a Genetic Algorithm",
            journal = "IEEE Intelligent Systems",
            volume = "13",
            pages = "44-49",
            year = "1998",
            url = "citeseer.nj.nec.com/yang98feature.html" }
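The wrapper idea in that paper can be sketched as a bit-mask GA over the feature set. Below is a toy, hedged version (synthetic data, and a cheap correlation-based proxy in place of actually training the downstream classifier or NN on each candidate subset):

```python
import random

random.seed(1)

# Synthetic data: 6 candidate features, but only features 0 and 3 carry signal.
N = 200
X = [[random.gauss(0, 1) for _ in range(6)] for _ in range(N)]
y = [row[0] + 0.5 * row[3] + random.gauss(0, 0.1) for row in X]

def corr(xs, ys):
    # Pearson correlation (0.0 if either series is constant).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = sum((a - mx) ** 2 for a in xs) ** 0.5
    sy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def fitness(mask):
    # Proxy evaluation: reward correlation with the target, penalise subset
    # size (a real wrapper would train and validate the classifier here).
    score = sum(abs(corr([row[i] for row in X], y)) for i in range(6) if mask[i])
    return score - 0.05 * sum(mask)

def evolve(pop_size=20, generations=30):
    pop = [[random.randint(0, 1) for _ in range(6)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(survivors):
            p1, p2 = random.sample(survivors, 2)
            cut = random.randrange(1, 6)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            if random.random() < 0.2:             # bit-flip mutation
                i = random.randrange(6)
                child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best_mask = evolve()
```

On this toy data the GA reliably keeps the two informative features; the size penalty discourages carrying the noise features along.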

            Another interesting thing would be to let GP do 'feature extraction':
            combining different features (e.g. linearly) to form new features that
            may improve performance.

            @inproceedings{ sherrah97evolutionary,
            author = "Jamie R. Sherrah and Robert E. Bogner and Abdesselam
            Bouzerdoum",
            title = "The Evolutionary Pre-Processor: Automatic Feature
            Extraction for Supervised Classification using Genetic Programming",
            booktitle = "Genetic Programming 1997: Proceedings of the Second
            Annual Conference",
            month = "13-16",
            publisher = "Morgan Kaufmann",
            address = "Stanford University, CA, USA",
            editor = "John R. Koza and Kalyanmoy Deb and Marco Dorigo and David
            B. Fogel and Max Garzon and Hitoshi Iba and Rick L. Riolo",
            pages = "304--312",
            year = "1997",
            url = "citeseer.nj.nec.com/sherrah97evolutionary.html" }
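The feature-extraction idea above (evolving, say, a linear combination of the raw features) can likewise be sketched with a minimal (1+1)-style evolution strategy. The data and the strategy here are illustrative assumptions, not the method of the paper above:

```python
import random

random.seed(2)

# Synthetic data: the useful "hidden" feature is 2*x1 - x2 plus a little noise.
N = 150
X = [[random.gauss(0, 1) for _ in range(4)] for _ in range(N)]
y = [2.0 * row[1] - 1.0 * row[2] + random.gauss(0, 0.1) for row in X]

def corr(xs, ys):
    # Pearson correlation (0.0 if either series is constant).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = sum((a - mx) ** 2 for a in xs) ** 0.5
    sy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def fitness(w):
    # Quality of the extracted feature: |correlation| with the target.
    feature = [sum(wi * xi for wi, xi in zip(w, row)) for row in X]
    return abs(corr(feature, y))

def evolve(generations=500, sigma=0.1):
    # (1+1)-style strategy: mutate the weights, keep the child if no worse.
    w = [random.gauss(0, 1) for _ in range(4)]
    for _ in range(generations):
        child = [wi + random.gauss(0, sigma) for wi in w]
        if fitness(child) >= fitness(w):
            w = child
    return w

best_w = evolve()
```

The evolved weight vector defines one new composite feature that could then be fed to the downstream NN in place of (or alongside) the raw inputs.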

            HTH,
            Chris


            Roger Smith wrote:
            > I guess, Anna, what I am trying to get at is a way to
            > evolve the inputs to the NN, plug them into the NN,
            > run the NN, and see how well that run correlates with
            > a set of real-world data, then calculate the fitness
            > of the chromosomes from this correlation and run the
            > GA again to generate a new set of chromosomes.
            > Does that make sense?
          • Christian Gagne
            Message 5 of 21 , Sep 10, 2003
              Hi Roger and everyone,

              We did some work here on constructing features with GP, to use
              them as inputs to NNs, in a handwriting character recognition context.

              http://vision.gel.ulaval.ca/en/publications/Id_40/PublDetails.php

              @inproceedings{Lemieux40,
              author = { Alexandre Lemieux and Christian Gagné and Marc Parizeau },
              title = { Genetical Engineering of Handwriting Representations },
              booktitle = { Proc. of the International Workshop on Frontiers in Handwriting Recognition (IWFHR) },
              publisher = { IEEE Computer Press },
              year = { 2002 },
              month = { August 6-8 },
              location = { Niagara-On-The-Lake }
              }

              This was a first experiment; we are currently working on the project
              using a specialized representation for feature construction.

              People in Montreal are also working on this kind of problem, using
              a GA for feature selection on the inputs of NNs.

              @InProceedings{Oliveira02a,
              author = "L. S. Oliveira and R. Sabourin and F. Bortolozzi and
              C. Y. Suen",
              title = "Feature {S}election {U}sing {M}ulti-{O}bjective
              {G}enetic {A}lgorithms for {H}andwritten {D}igit
              {R}ecognition",
              booktitle = "Proceedings of the 16th International Conference on
              Pattern Recognition (ICPR'2002)",
              volume = "1",
              pages = "568--571",
              address = "Quebec City, Canada",
              publisher = "IEEE Computer Society Press",
              month = aug,
              year = "2002",
              }

              christian


              On Tue, 2003-09-09 at 13:46, Roger Smith wrote:
              > Does anyone know of or have any recommendations for
              > tutorials outlining GA and evolving inputs to Neural
              > Nets?
              > Thanks
              > Rog
              --

              ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
              ~> Christian Gagné
              ~> http://www.gel.ulaval.ca/~cgagne
            • Loganathan.G
              Message 6 of 21 , Sep 10, 2003
                Hi Rog and everybody,

                I hope the attached paper will answer your query.

                G. Loganathan


              • Hu,Jianjun
                Message 7 of 21 , Sep 11, 2003
                  As far as I know, one of the best pieces of work on evolving NNs
                  is Stanley's NEAT algorithm. You can check out the paper and
                  source code here.

                  http://www.cs.utexas.edu/users/kstanley/neat.html

                  Jianjun Hu (George)
                  Genetic Algorithm Research & Application Group (GARAGe)
                  Department of Computer Science & Engineering
                  Michigan State University
                  hujianju@...
                  Phone: 517-355-3796(o)

                  -----Original Message-----
                  From: genetic_programming@yahoogroups.com
                  [mailto:genetic_programming@yahoogroups.com]
                  Sent: Thursday, September 11, 2003 8:42 AM
                  To: genetic_programming@yahoogroups.com
                  Subject: [GP] Digest Number 496





                  ------------------------------------------------------------------------

                  There are 4 messages in this issue.

                  Topics in this digest:

                  1. RE: NN and GA
                  From: "Loganathan.G" <guru_logu@...>
                  2. GA and VC++
                  From: "mhassan_hamasa" <mhassan_hamasa@...>
                  3. Re: VC++
                  From: "mhassan_hamasa" <mhassan_hamasa@...>
                  4. Re: Re: VC++
                  From: Christian Gagne <cgagne@...>


                  ________________________________________________________________________
                  ________________________________________________________________________

                  Message: 1
                  Date: Thu, 11 Sep 2003 06:58:42 +0100 (BST)
                  From: "Loganathan.G" <guru_logu@...>
                  Subject: RE: NN and GA

                  HI Rog and Everybody

                  I hope the attached paper will answer your querry.

                  G.Loganathan



                  [This message contained attachments]



                  ________________________________________________________________________
                  ________________________________________________________________________

                  Message: 2
                  Date: Thu, 11 Sep 2003 07:57:44 -0000
                  From: "mhassan_hamasa" <mhassan_hamasa@...>
                  Subject: GA and VC++

                  Hi everyone,

                  I want to use a GA in an assignment at my college.
                  I want to make a VC++ program that uses a GA, so if anyone has a
                  GA in C++ form, please contact me as soon as possible.


                  bye



                  ________________________________________________________________________
                  ________________________________________________________________________

                  Message: 3
                  Date: Thu, 11 Sep 2003 08:46:58 -0000
                  From: "mhassan_hamasa" <mhassan_hamasa@...>
                  Subject: Re: VC++

                  Hi,

                  Thanks a lot for your help.
                  Thanks Marc, I tried the first site and downloaded eo-0.9.2.gz,
                  but I don't know how to use the extracted files.
                  Thanks a lot Iqbal, I tried GAlib, but it gives me linking errors
                  in VC; if you use it, can you help me solve these errors?

                  Thanks a lot Roger, I think it depends on what you want to do; I
                  want to use a GA for cryptanalysis in my thesis. If your code is
                  generic and I can use the GA part on its own, that would be nice.


                  Thanks again for your help.

                  Anyway, what I want is really simple: I just need something like a
                  C++ class that implements the GA. I need a generic implementation
                  of a GA, not a class that solves a specific problem like the
                  travelling salesman problem.

                  Again, if anyone can help me, please do.

                  bye
                  MHassan



                  ________________________________________________________________________
                  ________________________________________________________________________

                  Message: 4
                  Date: 11 Sep 2003 08:23:09 -0400
                  From: Christian Gagne <cgagne@...>
                  Subject: Re: Re: VC++

                  Hi MHassan,

                  Try Open BEAGLE. On the GA side, it currently implements only a
                  simple bit-string GA, but it is very generic and can easily be
                  tuned to your specific needs.

                  http://www.gel.ulaval.ca/~beagle

                  A generic linear genome framework will be developed this fall. It
                  will include instantiations of the most well-known linear
                  representations (bit strings, real-valued vectors, ES, messy GA).
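For what it's worth, the kind of generic bit-string GA being asked for fits in a short, reusable class. This is a toy sketch (shown in Python for brevity; the same structure ports directly to a C++ class) and is not Open BEAGLE's API:

```python
import random

class BitStringGA:
    """A minimal generic bit-string GA: the user supplies only the genome
    length and a fitness function, in the spirit of generic frameworks."""

    def __init__(self, genome_len, fitness, pop_size=40,
                 crossover_rate=0.8, mutation_rate=0.02, seed=None):
        self.rng = random.Random(seed)
        self.genome_len = genome_len
        self.fitness = fitness
        self.pop_size = pop_size
        self.cx = crossover_rate
        self.mut = mutation_rate
        self.pop = [[self.rng.randint(0, 1) for _ in range(genome_len)]
                    for _ in range(pop_size)]

    def _select(self):
        # Tournament selection of size 2.
        a, b = self.rng.sample(self.pop, 2)
        return a if self.fitness(a) >= self.fitness(b) else b

    def step(self):
        new = []
        while len(new) < self.pop_size:
            p1, p2 = self._select(), self._select()
            if self.rng.random() < self.cx:
                cut = self.rng.randrange(1, self.genome_len)  # one-point crossover
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            # Per-bit flip mutation.
            new.append([b ^ 1 if self.rng.random() < self.mut else b
                        for b in child])
        self.pop = new

    def run(self, generations):
        best = max(self.pop, key=self.fitness)
        for _ in range(generations):
            self.step()
            cand = max(self.pop, key=self.fitness)
            if self.fitness(cand) > self.fitness(best):
                best = cand
        return best

# OneMax: maximise the number of 1 bits in the genome.
ga = BitStringGA(genome_len=20, fitness=sum, pop_size=40, seed=42)
best = ga.run(60)
```

The OneMax example at the bottom shows the intended usage: the user supplies only a genome length and a fitness function, and everything else is problem-independent.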

                  christian


                  On Thu, 2003-09-11 at 04:46, mhassan_hamasa wrote:
                  > Anyway, what I want is really simple: I just need something like
                  > a C++ class that implements the GA. I need a generic
                  > implementation of a GA, not a class that solves a specific
                  > problem like the travelling salesman problem.
                  --

                  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
                  ~> Christian Gagné
                  ~> http://www.gel.ulaval.ca/~cgagne



                • dreamwafer
                  Message 8 of 21 , Sep 11, 2003
                    --- In genetic_programming@yahoogroups.com, "graamonexxx"
                    <graamone@h...> wrote:
                    > Is it then fair to say that MEP is a separate 'genre' of
                    > evolutionary computing, or that it's only a specific GP
                    > configuration? Is there data available showing that there is
                    > something inherent in the MEP algorithm that makes it perform
                    > better than a GP system configured as described, on the same
                    > problem?

                    Taking this argument to an extreme, I could say that the word
                    processor I use is "just a configuration of a C++ program".
                    Furthermore, just because a tool makes something easy to
                    implement doesn't necessarily devalue the result. That said, I'm
                    highly skeptical of claims asserting that some variant of GA is
                    universally better for all (or even most) problems. IMO, it's
                    better to match the technique to the problem, as the NFL theorem
                    suggests.

                    -- Greg Schmidt
                  • Douglas B. Kell
                    Message 9 of 21 , Sep 11, 2003
                      2 questions.

                      1. Can someone in this thread tell me why, given that you can
                      do 'standard' tree-based GP in which you evolve both the
                      topology/architecture AND the nodal functions to optimise some
                      output(s), you would ever want to constrain yourself to the
                      type of squashing function (often tanh) typically used in
                      backprop nets?

                      2. Hal White and others have shown that an arbitrarily sized
                      NN with nonlinear continuous squashing functions and one
                      hidden layer can effect an arbitrary mapping from inputs to
                      outputs. Has a similar 'proof' been given for GP, and with
                      what constraints?
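For reference, one common form of that universal-approximation result (due to Hornik, Stinchcombe and White, 1989, with sigma any non-constant, bounded, continuous squashing function) can be stated as:

```latex
\forall f \in C(K),\ K \subset \mathbb{R}^n \text{ compact},\ \forall \varepsilon > 0:\quad
\exists N,\; v_j, b_j \in \mathbb{R},\; w_j \in \mathbb{R}^n \ \text{such that}\quad
\sup_{x \in K} \Bigl| f(x) - \sum_{j=1}^{N} v_j\, \sigma\bigl(w_j^\top x + b_j\bigr) \Bigr| < \varepsilon .
```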

                      As an aside, we have often found problems that linearise well
                      when you use only a subset of the input variables, a subset
                      which can still solve the problem. By not including ALL of the
                      input variables - many of which contribute only noise -
                      apparent, or at least UNNECESSARY, nonlinearities are removed.
                      NNs are poor at setting weights to 0.0.

                      All the best,
                      Douglas.

                      > As far as I know, one of the best work of Evolving NN is
                      > Stanley's NEAT algorithm. You could check out paper and source code
                      > here.
                      >
                      > http://www.cs.utexas.edu/users/kstanley/neat.html
                      >
                      > Jianjun Hu (George)
                      > Genetic Algorithm Research & Application Group (GARAGe)
                      > Department of Computer Science & Engineering
                      > Michigan State University
                      > hujianju@...
                      > Phone: 517-355-3796(o)
                      >
                      > -----Original Message-----
                      > From: genetic_programming@yahoogroups.com
                      > [mailto:genetic_programming@yahoogroups.com]
                      > Sent: Thursday, September 11, 2003 8:42 AM
                      > To: genetic_programming@yahoogroups.com
                      > Subject: [GP] Digest Number 496
                      >
                      >
                      >
                      > To unsubscribe from this group, send an email to:
                      > genetic_programming-unsubscribe@yahoogroups.com
                      >
                      >
                      > ----------------------------------------------------------------------
                      > --
                      >
                      > There are 4 messages in this issue.
                      >
                      > Topics in this digest:
                      >
                      > 1. RE: NN and GA
                      > From: "Loganathan.G" <guru_logu@...>
                      > 2. GA and VC++
                      > From: "mhassan_hamasa" <mhassan_hamasa@...>
                      > 3. Re: VC++
                      > From: "mhassan_hamasa" <mhassan_hamasa@...>
                      > 4. Re: Re: VC++
                      > From: Christian Gagne <cgagne@...>
                      >
                      >
                      > ______________________________________________________________________
                      > __
                      > ______________________________________________________________________
                      > __
                      >
                      > Message: 1
                      > Date: Thu, 11 Sep 2003 06:58:42 +0100 (BST)
                      > From: "Loganathan.G" <guru_logu@...>
                      > Subject: RE: NN and GA
                      >
                      > HI Rog and Everybody
                      >
                      > I hope the attached paper will answer your querry.
                      >
                      > G.Loganathan
                      >
                      >
                      > ______________________________________________________________________
                      > __ Want to chat instantly with your online friends? Get the FREE
                      > Yahoo! Messenger http://mail.messenger.yahoo.co.uk
                      >
                      > [This message contained attachments]
                      >
                      >
                      >
                      > ______________________________________________________________________
                      > __
                      > ______________________________________________________________________
                      > __
                      >
                      > Message: 2
                      > Date: Thu, 11 Sep 2003 07:57:44 -0000
                      > From: "mhassan_hamasa" <mhassan_hamasa@...>
                      > Subject: GA and VC++
                      >
                      > Hi every one,
                      >
                      > I want to use GA in assigment in my collage,
                      > i want to make a VC++ program use the GA, so if any one has a GA in
                      > C== form please contact me as sonn as possible.
                      >
                      >
                      > bye
                      >
                      >
                      >
                      > ______________________________________________________________________
                      > __
                      > ______________________________________________________________________
                      > __
                      >
                      > Message: 3
                      > Date: Thu, 11 Sep 2003 08:46:58 -0000
                      > From: "mhassan_hamasa" <mhassan_hamasa@...>
                      > Subject: Re: VC++
                      >
                      > Hi,
                      >
                      > Thanks a lot for your help.
                      > Thanks Marc, I tried the first site and downloaded eo-0.9.2.gz, but I don't
                      > know how to use the extracted files. Thanks a lot Iqbal, I tried GAlib, but
                      > it gives me linking errors in VC; if you use it, can you help me to solve
                      > these errors?
                      >
                      > Thanks a lot Roger, I think it depends on what you will do; I want to
                      > use a GA for cryptanalysis in my thesis. If your code is generic and
                      > I can use the GA part on its own, that would be nice.
                      >
                      >
                      > Thanks again for your help,
                      >
                      > Anyway, what I want is really simple: I need only something like a C++
                      > class implementing the GA itself. I need a generic implementation of a GA,
                      > not a class that solves one particular problem like the travelling salesman
                      > problem.
                      >
                      > Again, if anyone can help me, please do.
                      >
                      > bye
                      > MHassan
                      >
                      >
                      >
                      > ______________________________________________________________________
                      > __
                      > ______________________________________________________________________
                      > __
                      >
                      > Message: 4
                      > Date: 11 Sep 2003 08:23:09 -0400
                      > From: Christian Gagne <cgagne@...>
                      > Subject: Re: Re: VC++
                      >
                      > Hi MHassan,
                      >
                      > Try Open BEAGLE. On the GA side, it currently implements only a simple
                      > bit-string GA, but it is very generic and can easily be tuned to your
                      > specific needs.
                      >
                      > http://www.gel.ulaval.ca/~beagle
                      >
                      > A generic linear genome framework will be developed this Fall. It will
                      > include instantiations of the most well-known linear representations (bit
                      > strings, real-valued vectors, ES, messy GA).
                      >
                      > christian
                      >
                      >
                      > --
                      >
                      > ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
                      > ~> Christian Gagné
                      > ~> http://www.gel.ulaval.ca/~cgagne
                      >
                      >
                      >
                      > ______________________________________________________________________
                      > __
                      > ______________________________________________________________________
                      > __
                      >
                      >
                      >
                      >

                      --
                      Douglas Kell
                      EPSRC/RSC Chair in Bioanalytical Science
                      Dept Chemistry, Faraday Building , UMIST, PO Box 88, Sackville St,
                      MANCHESTER M60 1QD
                      Tel: 0161 200 4492 Fax 0161 200 4556 dbk@...
                      old website http://qbab.aber.ac.uk
                    • Bill White
                      Message 10 of 21 , Sep 11, 2003
                        Our software evolves all the NN features: variable selection, topology and weights. It is based on Koza's work, but we added variable selection.

                        Koza, John R., and Rice, James P. 1991b. Genetic generation of both the weights and architecture for a neural network. In Proceedings of International Joint Conference on Neural Networks, Seattle, July 1991. Los Alamitos, CA: IEEE Press. Volume II. Pages 397-404. This paper shows how to find both the weights and architecture for a neural network (including the number of layers, the number of processing elements per layer, and the connectivity between processing elements). This is accomplished using a recently developed extension to the genetic algorithm which genetically breeds a population of LISP symbolic expressions (S-expressions) of varying size and shape until the desired performance by the network is successfully evolved. The new "genetic programming" paradigm is applied to the problem of generating a neural network for the one-bit adder.

                        Again, our papers, which choose genes amongst many noisy ones to predict disease status, are here:

                        http://www.cs.bham.ac.uk/~wbl/biblio/gp-html/MarylynDRitchie.html
                        
                        and
                        
                        http://www.pubmedcentral.nih.gov/articlerender.fcgi?rendertype=abstract&artid=183838




                        -- 
                        Nature teaches more than she preaches. There are no sermons in
                        stones. It is easier to get a spark out of a stone than a moral. 
                        -John Burroughs, naturalist and writer (1837-1921)
                        
                      • Howard Landman
                        Message 11 of 21 , Sep 11, 2003
                          Bill White wrote:
                          Our software evolves all the NN features: variable selection, topology and weights. It is based on Koza's work, but we added variable selection.

                          Koza, John R., and Rice, James P. 1991b. Genetic generation of both the weights and architecture for a neural network. In Proceedings of International Joint Conference on Neural Networks, Seattle, July 1991. Los Alamitos, CA: IEEE Press. Volume II. Pages 397-404.
                          Another very powerful approach to this is Frederic Gruau's "cellular encoding" technique.  Basically he evolves a GP-like genotype program which, when executed on a simple one-neuron graph, modifies it into a complete NN phenotype.  It is possible to give an arbitrary integer n as an input to the program, and thus produce a family of solutions of different sizes.  Gruau was able to evolve a perfect solution to the n-bit parity problem for all n by simply evaluating each individual on a few different sizes.  See for example Gruau, Whitley, & Pyeatt, "A Comparison between Cellular Encoding and Direct Encoding for Genetic Neural Networks", Genetic Programming 1996, p.81.

                          I haven't read the Koza-Rice paper so I can't comment on any similarities or differences.

                              Howard A. Landman
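[Editor's note] To give a concrete feel for the idea Howard describes, here is a toy Python sketch of the *flavor* of cellular encoding. This is an illustration only, not Gruau's actual operator set: real cellular encoding uses a grammar tree with link registers, weights, and recursion, whereas this sketch uses just a flat list of serial/parallel cell-division operations applied to a one-neuron graph.

```python
def decode(genome):
    """Grow a network topology from a single neuron via division ops.

    The graph starts as in -> n0 -> out. genome is a list of (op, node)
    pairs, op in {"SEQ", "PAR"}: SEQ splits a node into two in series,
    PAR splits it into two in parallel. Returns an adjacency dict
    mapping each node to the set of its successor nodes.
    """
    adj = {"in": {"n0"}, "n0": {"out"}, "out": set()}
    counter = 1
    for op, node in genome:
        if node not in adj or node in ("in", "out"):
            continue  # ignore ops addressing missing/terminal nodes
        new = "n%d" % counter
        counter += 1
        if op == "SEQ":
            adj[new] = set(adj[node])   # new cell inherits old outputs
            adj[node] = {new}           # old cell now feeds only the new one
        elif op == "PAR":
            adj[new] = set(adj[node])   # new cell duplicates the outputs...
            for preds in adj.values():
                if node in preds:
                    preds.add(new)      # ...and the inputs of the old cell
    return adj
```

Because the genotype is a program rather than a fixed-size blueprint, the same program can in principle be parameterised (e.g. by a repeat count n) to yield a whole family of networks, which is what made the n-bit parity result possible.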
                        • John Koza
                          Message 12 of 21 , Sep 12, 2003
                            Hello All:
                             
                            Frederic Gruau's technique is clearly the elegant way to do this. 
                             
                            Our 1991 paper at the neural network conference was a way to address the connectivity within programs and was handled in a way (called "define building blocks") that is a precursor to the idea of an automatically defined function (ADF, subroutine).
                             

                            John R. Koza

                            Consulting Professor
                            Biomedical Informatics
                            Department of Medicine
                            Medical School Office Building (MC 5479)
                            Stanford University
                            Stanford, California 94305-5479

                            Consulting Professor
                            Department of Electrical Engineering
                            School of Engineering
                            Stanford University

                            PREFERRED MAILING ADDRESS:
                            Post Office Box K
                            Los Altos, CA 94023-4011 USA

                            Phone: 650-941-0336
                            Fax: 650-941-9430
                            E-Mail: koza@...
                            WWW Home Page: http://www.smi.stanford.edu/people/koza

                            For information about field of genetic programming in general:
                            http://www.genetic-programming.org

                            For information about Genetic Programming Inc.:
                            http://www.genetic-programming.com

                            For information about the 2003 book "Genetic Programming IV: Routine Human-Competitive Machine Intelligence", visit http://www.genetic-programming.org/gpbook4toc.html

                            For the genetic programming bibliography, visit http://liinwww.ira.uka.de/bibliography/Ai/genetic.programming.html

                            For information about the Genetic Programming book series from Kluwer Academic Publishers, visit http://www.genetic-programming.org/gpkluwer.html

                            For information about the annual Genetic and Evolutionary Computation Conference (GECCO) (which includes the annual Genetic Programming Conference) to be held in Seattle on June 26-30, 2004 (Saturday - Wednesday) and the International Society on Genetic and Evolutionary Computation that operates the conference, visit: http://www.isgec.org/

                            For information about the annual Euro-Genetic-Programming Conference to be held in April 5-7, 2004 (Monday-Wednesday) at the University of Coimbra in Coimbra Portugal, visit http://www.evonet.info/eurogp2004/

                            For information about the annual NASA/DoD Conference on Evolvable Hardware in Seattle on June 24-26 (Thursday - Saturday), 2004, visit http://ehw.jpl.nasa.gov/events/nasaeh04/

                             
                          • Lucas, Simon M
                            Message 13 of 21 , Sep 13, 2003
                               
                               Hi Everyone,
                               
                               In my view the area of evolving complex
                               structures such as neural networks is very
                               interesting, and while some methods (such
                               as Gruau's cellular encoding) are elegant,
                               it is often not very clear that they give any
                               benefit over a simple direct encoding of the
                               neural network graph.  Kitano's matrix
                               rewriting method was an indirect technique
                               that had been claimed to outperform direct
                               encoding, but our 1998 paper below showed
                               that you could make direct encoding perform
                               just as well (on the same benchmarks) if you
                               configured it properly:
                               

                              Abdul A. Siddiqi and Simon M. Lucas, "A comparison of matrix rewriting versus direct encoding for evolving neural networks", Proceedings of the IEEE International Conference on Evolutionary Computation (1998), pages 392-397

                               
                               Indirect methods bias the search
                               over the set of possible neural architectures in
                               some way, but it is interesting to investigate whether
                               this is truly beneficial, if so why, and when it
                               makes matters worse.
                               
                               Best regards,
                               
                                 Simon Lucas
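[Editor's note] A concrete picture of a "direct encoding" in the sense Simon describes: the genome literally *is* the connection mask and weights of the network, and mutation acts on it in place. This Python sketch is illustrative only; the layer sizes (3 inputs, 4 hidden units, 1 output) and mutation parameters are arbitrary choices, not taken from either paper.

```python
import math
import random

def make_genome(n_in=3, n_hidden=4, rng=random):
    # One (enabled, weight) gene per possible connection, stored flat:
    # n_in*n_hidden input->hidden links, then n_hidden hidden->output links.
    n = n_in * n_hidden + n_hidden
    return [(rng.random() < 0.5, rng.uniform(-1, 1)) for _ in range(n)]

def forward(genome, x, n_hidden=4):
    # Decode the genome directly as a feedforward net and evaluate it on x.
    n_in = len(x)
    hidden = []
    for h in range(n_hidden):
        genes = genome[h * n_in:(h + 1) * n_in]   # links into hidden unit h
        s = sum(w * x[i] for i, (on, w) in enumerate(genes) if on)
        hidden.append(math.tanh(s))
    out_genes = genome[n_in * n_hidden:]          # hidden -> output links
    return float(sum(w * hidden[h] for h, (on, w) in enumerate(out_genes) if on))

def mutate(genome, rate=0.1, rng=random):
    # Direct mutation: independently flip connection bits / jitter weights.
    return [((not on) if rng.random() < rate else on,
             w + rng.gauss(0, 0.2) if rng.random() < rate else w)
            for on, w in genome]
```

The point of the Siddiqi & Lucas comparison is that a representation this plain, properly configured, matched the indirect matrix-rewriting results on the same benchmarks.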
                            • Roger Smith
                              Message 14 of 21 , Sep 13, 2003
                                Yes Anna, you said it just right: I want to be able to
                                evolve the inputs to the NN itself. I believe we have
                                the NN set up correctly, but we have about 12 inputs
                                that I would like to evolve. Any thoughts?
                                Thanks
                                Rog




                              • Candida Ferreira
                                Message 15 of 21 , Sep 13, 2003
                                  Roger Smith wrote:

                                  > Yes Anna you said it just right, I want to be able to
                                  > evolve the inputs to the NN itself. I believe we have
                                  > the NN set up correctly, but we have about 12 inputs
                                  > that I would like to evolve. Any thoughts?
                                  > Thanks
                                  > Rog
                                  >

                                  Since you already have the NN architecture and only need to evolve 12 inputs, this seems a fairly simple problem. Perhaps you could describe the NN to us (layers, neuron types and connectivity) and provide the fitness cases, and we could all apply our algorithms to evolve the adaptive parameters and see what we come up with.
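If it helps, here is a minimal sketch of evolving such an input subset with a simple GA: each individual is a binary mask over the 12 candidate inputs. The fitness function below is only a stand-in so the sketch runs on its own (it pretends inputs 0-5 are informative); in a real run it would train and score the fixed-architecture NN on the masked inputs.

```python
import random

N_INPUTS = 12          # number of candidate NN inputs, as in Roger's problem
POP_SIZE = 30
GENERATIONS = 40
MUT_RATE = 1.0 / N_INPUTS  # common per-bit mutation rate of 1/L

def fitness(mask):
    """Stand-in fitness: in practice, train/evaluate the fixed NN using only
    the inputs where mask[i] == 1 and return validation performance.
    Here we pretend inputs 0-5 are informative and 6-11 are noise."""
    return sum(mask[:6]) - 0.5 * sum(mask[6:])

def tournament(pop, k=3):
    # pick the fittest of k randomly sampled individuals
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):
    # single-point crossover
    point = random.randrange(1, N_INPUTS)
    return a[:point] + b[point:]

def mutate(mask):
    # flip each bit independently with probability MUT_RATE
    return [bit ^ (random.random() < MUT_RATE) for bit in mask]

random.seed(0)
pop = [[random.randint(0, 1) for _ in range(N_INPUTS)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP_SIZE)]
best = max(pop, key=fitness)
print(best, fitness(best))
```

With a real NN in the fitness function, each evaluation is a training run, so one would normally cache fitnesses and keep the population small.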

                                  Candida Ferreira

                                  -----------------------------------------------------------
                                  Candida Ferreira, Ph.D.
                                  Chief Scientist, Gepsoft
                                  73 Elmtree Drive
                                  Bristol BS13 8NA, UK
                                  ph: +44 (0) 117 330 9272
                                  http://www.gepsoft.com/gepsoft
                                  http://www.gene-expression-programming.com/author.asp
                                  -----------------------------------------------------------
                                • Bill White
                                  It seems to me then the real problem is variable selection (and possibly initial weights on these as inputs) instead of structural/topological, I would think a
                                  Message 16 of 21 , Sep 17, 2003
                                  • 0 Attachment
                                    It seems to me then that the real problem is variable selection (and possibly initial weights on these as inputs) rather than structure/topology, and I would think a GA or GP could do this. Our method of Symbolic Discriminant Analysis (SDA) has proved very effective at this:

                                    http://www.cs.bham.ac.uk/~wbl/biblio/gp-html/JasonHMoore.html
                                    http://medschool1.mc.vanderbilt.edu/brain_institute/php_files/faculty_blurbs.php?ID=925


                                    Another direction for selection among combinatorial restrictions is ant systems:

                                    http://www.idsia.ch/~luca/detail.htm#ant-q
                                    http://www.google.com/search?hl=en&lr=&ie=ISO-8859-1&q=Ant+Systems+variable+selection&btnG=Google+Search
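For the ant-system direction, a toy sketch of pheromone-based input selection might look like the following. The fitness function is again a stand-in for the real NN evaluation (inputs 0-5 are pretend-informative), and the update rule is a simplified ant-system scheme for illustration, not any particular published algorithm:

```python
import random

N_INPUTS = 12
N_ANTS = 20
ITERATIONS = 30
EVAPORATION = 0.1

def fitness(subset):
    """Stand-in for evaluating the fixed NN on the chosen input subset."""
    return (sum(1 for i in subset if i < 6)
            - 0.5 * sum(1 for i in subset if i >= 6))

random.seed(1)
pheromone = [1.0] * N_INPUTS
for _ in range(ITERATIONS):
    # each ant builds a subset, including input i with a probability
    # that grows with its pheromone level
    ants = [[i for i in range(N_INPUTS)
             if random.random() < pheromone[i] / (1.0 + pheromone[i])]
            for _ in range(N_ANTS)]
    best_ant = max(ants, key=fitness)
    # evaporate everywhere, then reinforce the best ant's chosen inputs
    pheromone = [(1 - EVAPORATION) * p for p in pheromone]
    for i in best_ant:
        pheromone[i] += max(fitness(best_ant), 0.0) * 0.1

selected = [i for i in range(N_INPUTS) if pheromone[i] > 1.0]
print(selected)
```

The positive feedback (reinforced inputs are chosen more often, so they are reinforced again) concentrates pheromone on the informative inputs, which is the same mechanism the Ant-Q and variable-selection references above exploit.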


                                    Roger Smith wrote:
                                    Yes Anna you said it just right, I want to be able to
                                    evolve the inputs to the NN itself.  I believe we have
                                    the NN set up correctly, but we have about 12 inputs
                                    that I would like to evolve. Any thoughts?
                                    Thanks
                                    Rog
                                    
                                    
                                    --- Anna I Esparcia Alcazar <anna@...> wrote:
                                      
                                    Dear Roger, Toni & Bill,
                                    
                                    Ken Sharman and myself did some work on evolving the
                                    topology and weights of recurrent NNs using GP back in
                                    1995 (you can find the references in
                                    http://www.iti.upv.es/cas/CASpubs.htm) but, if I
                                    understand it correctly, the question is about evolving
                                    the _inputs_ to the neural network, not the NN itself.
                                    Perhaps Roger would like to clarify this?
                                    
                                    Anna
                                    
                                    -------------------------------------
                                    Dr Anna I Esparcia-Alcazar
                                    Institut Tecnologic de Informatica
                                    Universitat Politecnica de Valencia
                                    Cami de Vera s/n
                                    46071 Valencia (Spain)
                                    
                                    Tel. +34 963877069 Fax +34 963877239
                                    -------------------------------------
                                    
                                      -----Original Message-----
                                      From: Tony Abou-Assaleh [mailto:taa@...]
                                      Sent: Wednesday, 10 September 2003 7:32
                                      To: genetic_programming@yahoogroups.com
                                      Subject: Re: [GP] NN and GA
                                    
                                    
                                      Hi Roger,
                                    
                                      I have done some work on evolving RNNs. Some of it
                                      you can find on my website at: http://www.cs.dal.ca/~taa/
                                     
                                      Email me if you have any questions about the approach
                                      or implementation. I can make the source code available
                                      (AS IS) both in Java and in C.
                                    
                                      Cheers,
                                    
                                      TAA
                                    
                                      --------------------------------------------------
                                      Tony Abou-Assaleh
                                      Ph.D. Candidate, Faculty of Computer Science
                                      Dalhousie University, Halifax, NS, Canada, B3H 1W5
                                      Fax:   902-492-1517
                                      Email: taa@...
                                      WWW:   http://www.cs.dal.ca/~taa/
                                      ---------------------[THE END]--------------------
                                    
                                    
                                      On Tue, 9 Sep 2003, Roger Smith wrote:
                                    
                                      > Does anyone know of or have any recommendations for
                                      > tutorials outlining GA and evolving inputs to Neural
                                      > Nets?
                                      > Thanks
                                      > Rog
                                      >
                                    
                                    
                                    


                                    -- 
                                    Nature teaches more than she preaches. There are no sermons in
                                    stones. It is easier to get a spark out of a stone than a moral. 
                                    -John Burroughs, naturalist and writer (1837-1921)
                                    