
AI and NLP...

  • Alan Grimes
    Message 1 of 3 , Dec 31, 2003
      om

      On this occasion of the 26th anniversary of my living in my parents'
      house, ( =( ) I want to share some of my latest underinformed thinking
      about AI, with a focus on NLP, which seems to be the hot topic of the day.

      The dogma of NLP, as with many other branches of AI, seems to be that
      "if we add enough rules it will work," or, more subtly, "if we simulate
      what we believe to be the top-level algorithm of the brain well enough,
      it will work." Both approaches, the first especially, are doomed to
      failure.

      The origin of this dogma is a variation of the same dualistic thinking
      behind a vast array of human expression. It can be found everywhere,
      from the religious belief in a soul to the uploader's equally fanciful
      belief that the mind can be separated from the body through technology.
      In this case, dualism means positing a part of the brain, the "higher
      functions" located exclusively in the cerebral cortex, that is entirely
      separable from the rest of the brain, which is designated the "lower
      functions".

      What is overlooked is that the human expressions our NLP systems have
      the most trouble dealing with are, in fact, "telemetry data" about the
      state of these so-called lower functions. Only a being which has an
      internal state roughly analogous to what we call "happiness" can
      understand the English sentence "I am happy" in any meaningful way.

      Therefore the projects most likely to achieve some level of NLP are
      those which place an emphasis on the brain's various subsystems, such
      as is required to produce androids.


      --
      President Bush's *head* is _F_L_A_T_.

      http://users.rcn.com/alangrimes/
    • Bob Mottram
      Message 2 of 3 , Jan 2, 2004
        Congratulations upon your 26 years of residence!

        AI researchers have pondered the problems of language understanding
        for decades. My own observation is that most researchers' definition
        of "language" is specifically written English text. However,
        "language" is really a broader spectrum of skills beyond just the
        reading of symbols and the manipulation of written text. Things like
        body and face movements and prosody all make up a significant part
        of human communication.

        Being able to read and write is a very important skill - especially
        if you want to navigate the internet - but it's not the only
        communication skill, and indeed our ability to interpret written
        text may in turn be heavily dependent upon more direct sensory
        experiences which don't involve text processing. The narrow
        concentration upon text processing is probably a legacy of Alan
        Turing's original thought experiment of an imitation game, in which
        he imagined only teletype machines could be used. In the modern
        world, when you look at advertising, movies, TV and so on, you can
        see many examples of non-written communication being used to try to
        influence our decision making.

        - Bob
      • Erik Starck
        Message 3 of 3 , Jan 2, 2004
          Speaking of NLP, complexity and the limits of computational
          intelligence, this paper:
          http://www.cs.lth.se/home/Bertil_Ekdahl/publications/CCbeS.pdf
          argues that "[the] lack of a language implies also that a computer
          cannot have a theory of mind, which is necessary in order to be
          able to know that other individuals may have different views of
          reality, that is, that my beliefs not necessarily are shared by
          other individuals."

          In short, it says that machines can never do what humans do.


          --
          Erik S.