
Metaphor in the Semantic Web -- Genetic Algorithms

  • david_dodds_2001
    Dec 31, 2007
      Metaphor in the Semantic Web -- Genetic Algorithms

      PART ONE - Neurons and other non-RDF Semantic Processors

      { PART TWO - Non-Stationarity and Meta-Objective
      Function in the Simple Genetic Algorithm
      - (future posting) }

      While the W3C has its attention riveted on RDF and RDF-based
      representations, there are other ways of representing so-called
      semantic data. Almost everyone has at least heard of artificial
      neural networks (aka 'nets' or 'neurons'). The basic element of all
      actual artificial neural nets (ANNs), the so-called neuron, bears
      only the remotest resemblance to any real biological neuron. In some
      respects the present-day artificial neuron used in ANNs is a
      software-based evolution of the McCulloch-Pitts electronic circuit,
      which was called a 'flip-flop' in its day, or by the technical term
      'bi-stable multivibrator'. It was literally a binary switch.
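      For concreteness, such a binary-switch 'neuron', a unit that fires if
      and only if its weighted input sum reaches a threshold, can be
      sketched in a few lines of Python (the weights and threshold below
      are illustrative, not from any source):

```python
def mcculloch_pitts(inputs, weights, threshold):
    """A McCulloch-Pitts style unit: outputs 1 if and only if the
    weighted sum of its binary inputs reaches the threshold.
    It is, literally, a binary switch."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# A two-input AND gate realised as such a unit
print(mcculloch_pitts([1, 1], [1, 1], 2))  # fires
print(mcculloch_pitts([1, 0], [1, 1], 2))  # does not fire
```

      Note there is no internal state, no timing, no analog behaviour:
      exactly the cartoon the next paragraph complains about.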

      Anyone who actually has competent knowledge of real biological
      neurons knows that this binary switch is a farcical, cartoonish
      representation of the functioning, external or internal, of actual
      biological neurons. Biological neurons are simultaneously analog and
      digital. The physical circuits used in those days to implement these
      electronic switches were large: remember, the original transistor
      was a can about the size of a green pea with three wires ('leads')
      sticking out the bottom, and the other discrete components used in
      the circuits (resistors, capacitors, inductors, diodes, etc.) were
      an order of magnitude or more larger than that. It took a housing
      the size of a refrigerator to put together a network of even a few
      thousand such switches.

      If you know anything about network theory, you know that is about
      the smallest network where anything of interest to humans might
      happen.

      With the onset of large-scale integration (LSI) the microprocessor
      came about, and it was seen that greater flexibility was to be had
      by replacing hardware switches with software abstractions. Latter-day
      'neurons' are software phantasms described by 'logic', just as real
      analogue computers and their maze of corded patch-boards were
      replaced with equations written in software (yes, ready for it, in
      FORTRAN!, CSMP, MAD, etc.). These days neural networks are typically
      implemented as matrix arithmetic over rather simplistic elements. It
      is more the functioning, by interaction, of the network
      (specifically, of the networked elements) that produces any
      occurring neural network behaviour, than whatever level of
      complexity and processing resides in EACH neuron. In some respects
      one might compare this with a Markov graph (machine), or a state
      machine. Contrasting with this view is the HMM (Hidden Markov
      Model), where each element of the graph (or network) has a complex
      functioning 'interior', a 'hidden' model which is much more complex
      than a simple Newtonian function (such as a two-state switch). What
      do we get if we replace the stochastic elements of the HMM with
      Lotfi Zadeh's Usuality components? (A context-sensitive
      'similarity'-based graph instead of a 'crisp', 'identity'-values
      type.)
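      The point that modern neural networks reduce to matrix arithmetic
      over simplistic elements can be made concrete (a minimal sketch in
      plain Python; the layer sizes, weights, and sigmoid choice are
      illustrative, not from any source):

```python
import math

def layer(inputs, weights, biases):
    """One network layer: a matrix-vector product followed by an
    elementwise sigmoid 'squash'. Each 'neuron' is just one row of
    the weight matrix plus a bias -- a very simplistic element."""
    return [
        1.0 / (1.0 + math.exp(-(sum(w * x for w, x in zip(row, inputs)) + b)))
        for row, b in zip(weights, biases)
    ]

# A tiny 2-input, 2-hidden, 1-output net. Any interesting behaviour
# arises from the interaction of layers, not from any single unit.
hidden = layer([0.5, -1.0], [[0.8, 0.2], [-0.4, 0.9]], [0.1, -0.1])
output = layer(hidden, [[1.5, -0.7]], [0.05])
print(output)
```

      Every unit here is the same trivial function; the network's
      behaviour lives entirely in the pattern of weights connecting them.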

      In this next section we look at a paper:
      http://arantxa.ii.uam.es/~alfonsec/docs/simul98f.htm

      An object-oriented continuous simulation language and its use for
      training purposes.

      Manuel Alfonseca, Juan de Lara, Estrella Pulido;
      Universidad Autonoma de Madrid, Dept. Ingenieria Informatica;
      {Juan.Lara, Manuel.Alfonseca, Estrella.Pulido}@... .

      Here is an excerpt from that paper showing some listings of code;
      note that it shows an object-oriented update of the CSMP language
      {CSMP: Continuous Systems Modelling Program}.

      Listing 1 gives an example of the declaration of a class.

      * Definition of Planet class *
      CLASS Planet {
      NAME name
      DATA M, X0, Y0, XP0, YP0, FI
      * Calculations for a planet *
      * Distance to the Sun
      R2 := X*X+Y*Y
      R := SQRT(R2)
      Y1 := Y*CFI
      Z := Y*SFI
      * Mutual influences
      * The Sun on this planet
      APS := G*MS/R2/R
      * This planet on the Sun
      ASP := G*M/R2/R
      XPP := -(ASP+APS)*X
      YPP := -(ASP+APS)*Y
      XP := INTGRL(XP0,XPP)
      YP := INTGRL(YP0,YPP)
      X := INTGRL(X0,XP)
      Y := INTGRL(Y0,YP)
      * Mutual actions of two planets *
      * Distance to another planet
      DPP2 :=
      DPP := SQRT(DPP2)
      * Influences
      * The other planet on the Sun
      ASP1 := G*Planet.M/Planet.R2/Planet.R
      * The other planet on this planet
      APP1 := G*Planet.M/DPP2/DPP
      * Coordinate conversion
      Y2 := Planet.Y*COS(Planet.FIR-FIR)
      * Actual action of the planet
      XPP += APP1*(Planet.X-X) - ASP1*Planet.X
      YPP += APP1*(Y2-Y) - ASP1*Y2
      * Other data *
      PRINT R
      PLOT Y,X
      FINISH R=.0001

      Listing 1: Declaration of a class in OOCSMP
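      The INTGRL blocks above are the heart of a CSMP-style simulation:
      each integrates its second argument over time, starting from the
      initial condition given as the first. A hypothetical Python
      analogue, using a naive Euler step on a single body orbiting a
      central mass (the constants and units below are illustrative; this
      is not the paper's multi-planet model):

```python
def simulate(x0, y0, xp0, yp0, g_ms, delta, fintim):
    """Integrate one body around a central mass at the origin,
    mirroring the XPP/YPP -> XP/YP -> X/Y chain of INTGRL blocks
    in Listing 1, with a simple Euler step of size delta."""
    x, y, xp, yp = x0, y0, xp0, yp0
    t = 0.0
    while t < fintim:
        r2 = x * x + y * y
        r = r2 ** 0.5
        a = g_ms / (r2 * r)           # cf. APS := G*MS/R2/R
        xpp, ypp = -a * x, -a * y     # acceleration toward the origin
        xp += xpp * delta             # cf. XP := INTGRL(XP0, XPP)
        yp += ypp * delta             # cf. YP := INTGRL(YP0, YPP)
        x += xp * delta               # cf. X  := INTGRL(X0, XP)
        y += yp * delta               # cf. Y  := INTGRL(Y0, YP)
        t += delta
    return x, y

# Circular orbit at radius 1 in AU/year units, where G*MS ~ 4*pi^2
x, y = simulate(1.0, 0.0, 0.0, 6.2832, 39.478, 0.0005, 1.0)
print(x, y)
```

      A production CSMP runtime would use a better integration rule than
      Euler, but the structure (state variables fed back through
      integrator blocks each time step) is the same.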

      Listing 2 gives an example of the construction of several objects of
      class Planet, assuming that the definition of this class is contained
      in file "Planet.csm".

      * Universal data *
      DATA G:=0.00011869, PI:=3.141592653589793
      * Sun mass
      DATA MS:=332999
      INCLUDE "Planet.csm"
      * Actual planets *
      Planet Mercury("Mercu",0.055271,-0.3871, 0, 2.078, -9.892, 7.004)
      Planet Venus ("Venus",0.81476, 0.7233, 0, 0.051, 7.39, 3.394)
      Planet Earth ("Earth",1, 0, 1, -6.2899, 0.107, 0 )
      Planet Moon ("Moon", 0.01235, 0, 0.9975,-6.0783, 0.107, 0 )
      Planet Mars ("Mars", 0.10734, 1.5233, 0, 0.476, 5.071, 1.85 )
      Planet Apollo ("Apolo",1957E-14, 0, 1.4849,-4.253, 2.915, 6.4 )
      Planet Jupiter("Jupit",317.94, 0, -5.2028, 2.754, 0.131, 1.308)
      Planet Saturn ("Satur", 95.181, 9.5388, 0, 0.113, 2.034, 2.488)
      Planet Uranus ("Urano", 14.535, 0, 19.1914,-1.431, 0.067, 0.774)
      Planet Neptune("Neptu", 17.135,-30.0611, 0, 0.0117,-1.147, 1.774)
      Planet Pluto ("Pluto",0.0021586,0, -39.5294, 0.971 , 0.249,17.148)
      Planet System := Mercury, Venus, Earth, Moon, Mars, Apollo, Jupiter,
      Saturn, Uranus, Neptune, Pluto
      * Time intervals and other data *
      TIMER delta:=.0005, FINTIM:=2, PRdelta:=.1, PLdelta:=.01

      Listing 2: Simulating the solar system in OOCSMP

      A geostationary satellite

      A geostationary satellite, which keeps its distance to the Earth
      constant, has been simulated using the Planet class defined above,
      without any change.

      * Universal data *
      DATA G:=4.979E-16, PI:=3.141592653589793
      * Earth data
      DATA MS:=5.979E21
      INCLUDE "Planet.csm"
      * Actual satellites *
      Planet Geost ("Geost",1, 0, 42.24637, -265.462, 0, 0 )
      Planet Moon ("Moon", 7.384E19, 0, 392.1, -86.65, 4.4, 0 )
      Planet System := Geost, Moon
      * Time intervals and other data *
      TIMER delta:=.0005, FINTIM:=2, PRdelta:=.1, PLdelta:=.01
      PRINT Geost.X

      Listing 3: Simulating a geostationary satellite

      In PART TWO of Metaphor in the Semantic Web -- Genetic Algorithms
      we will see the relevance of the OOCSMP code listed above, and read
      about Genetic Algorithms in the paper 'Non-Stationarity and
      Meta-Objective Function in the Simple Genetic Algorithm'. What is
      discussed is the different means of representing semantic
      information. It is part of the overall discussion about what
      metaphor is in the semantic web. Also in the next part we see how
      ontology can be applied to symbolic computing such as Genetic
      Algorithms. Metaphor in computing brings with it the capability of
      programs having second-order metaprogramming capability. What that
      is is discussed in PART TWO.
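      Ahead of PART TWO, a Simple Genetic Algorithm of the kind that
      paper's title refers to can be sketched (a minimal, generic sketch,
      not taken from the cited paper: bitstring individuals, a toy
      'count the ones' objective, and illustrative parameters):

```python
import random

def simple_ga(bits=16, pop_size=20, generations=60, p_mut=0.02, seed=1):
    """Simple Genetic Algorithm: binary tournament selection,
    one-point crossover, and bit-flip mutation. Fitness here is
    just the number of 1-bits (the classic 'OneMax' toy problem)."""
    rng = random.Random(seed)
    fitness = sum
    pop = [[rng.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1 = max(rng.sample(pop, 2), key=fitness)   # tournament
            p2 = max(rng.sample(pop, 2), key=fitness)
            cut = rng.randrange(1, bits)                # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (rng.random() < p_mut) for b in child]  # mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = simple_ga()
print(best, sum(best))
```

      Note that the objective function here is fixed; the cited paper's
      topic, non-stationarity, concerns what happens when it is not.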
      previously Copyright David Dodds