- Apr 1, 2002

Just sharing some views on distributed AI, and trying to gather some
support against the idea that XML is a good technology for it.
AI made The Economist recently,
and The Economist predicts a strong future presence for AI:
Why is this interesting? Because, almost by definition, if The Economist
plays to it... it has to be big enough, at least in perception.
These are some of the drivers:
1) "The semantic Web" and related technologies:
2) Darpa with DAML:
3) Microsoft with .NET, check:
Actively advertising "intelligent" technologies.
4) Intelligence required for B2B, ebXML, RosettaNet, EAI, etc.
EAI/Workflow: App Servers, EAI, Queues (MQ, Open JMS, etc.), Web Services,
Workflow, and Business Process Management (Vitria, etc.)
5) IBM's biggest iron is advertised as "intelligent":
6) Lifestreams, Linda on the Web, etc.; all requiring some basic
AI techniques like pattern-matching, and distributed agents that
know about ontologies.
So, AI seems to be coming back from many different, but _major_, areas:
Tim Berners-Lee (inventor of HTTP/HTML)
Business Standards (B2B, ebXML, EAI, RosettaNET, etc.)
Research Projects (aLife, Biological Metaphors, Digital Life)
But help me cut through some Gordian Knots... why does AI have to
be implemented through RPC, client-server, XMLish technologies that:
1) have many layers of bloat (serializations/deserializations),
2) bring discomfort and confusion by introducing
"disconnected layered languages", and
3) don't have the appropriate semantics, facilities, libraries
and power to do AI jobs?
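To make the serialization point concrete, here is a rough Python sketch; the message shape, tag names, and the `parse_sexp` helper are all invented for illustration, not taken from any real B2B schema. It contrasts the layers an XML message goes through with the direct read an s-expression gets:

```python
# Sketch of the "layers of bloat" point: the same quote request as XML
# and as an s-expression. The XML version must be parsed into a DOM and
# then mapped onto application objects; the s-expression reads directly
# as the nested list the application works with.
import xml.etree.ElementTree as ET

xml_msg = "<request-quote><article adjective='YELLOW'>PENCILS-2</article></request-quote>"
root = ET.fromstring(xml_msg)                   # layer 1: text -> DOM
article = root.find("article")                  # layer 2: DOM -> app data
request = ["REQUEST-QUOTE", [article.text, article.get("adjective")]]

sexp_msg = "(REQUEST-QUOTE (PENCILS-2 YELLOW))" # same content, one layer

def parse_sexp(s):
    """Minimal s-expression reader: tokens -> nested lists."""
    tokens = s.replace("(", " ( ").replace(")", " ) ").split()
    def read(ts):
        t = ts.pop(0)
        if t == "(":
            out = []
            while ts[0] != ")":
                out.append(read(ts))
            ts.pop(0)  # drop the closing ")"
            return out
        return t
    return read(tokens)

print(parse_sexp(sexp_msg))  # ['REQUEST-QUOTE', ['PENCILS-2', 'YELLOW']]
print(request)               # same structure, reached through more layers
```

Both paths end at the same nested list; the s-expression just gets there without the DOM detour.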
As early as 1975 (written in 1975, published in 1982), "intelligent"
business exchanges have been proposed, like McCarthy's "Common Business
Communication Language" (CBCL), where basic exchanges like:
(REQUEST-QUOTE (ADJECTIVE (PENCILS #2) YELLOW))
have replies like:
(WE-QUOTE (OUR-STOCK-NUMBER A7305))
Even KQML and other ACLs are LISP-like.
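A hedged sketch of what answering such a request could look like, working directly on the nested-list form the s-expressions parse into; the `STOCK` table, the `PRICE` field, and the `answer` function are my own inventions for illustration, not taken from the CBCL paper:

```python
# A CBCL-style exchange sketched over nested lists (parsed s-expressions).
# The stock table and reply vocabulary here are hypothetical.
STOCK = {("PENCILS", "#2", "YELLOW"): ("A7305", "1.00 DOLLARS/GROSS")}

def answer(msg):
    """Answer a (REQUEST-QUOTE (ADJECTIVE (PENCILS #2) YELLOW)) message."""
    assert msg[0] == "REQUEST-QUOTE"
    # destructure (ADJECTIVE (PENCILS #2) YELLOW)
    _, (article, grade), color = msg[1]
    stock_no, price = STOCK[(article, grade, color)]
    return ["WE-QUOTE", ["OUR-STOCK-NUMBER", stock_no], ["PRICE", price]]

request = ["REQUEST-QUOTE", ["ADJECTIVE", ["PENCILS", "#2"], "YELLOW"]]
print(answer(request))
# ['WE-QUOTE', ['OUR-STOCK-NUMBER', 'A7305'], ['PRICE', '1.00 DOLLARS/GROSS']]
```

The point is that the reply is built in the same representation the request arrived in; no mapping layer sits between the wire format and the program's data.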
You can also do this with X12 and now XML, but using a LISP-like syntax,
we can _also_ send rules, computable things (classes, functions, patterns,
etc.), do pattern-matching, send/share ontologies, do knowledge exchanges,
etc. So, imo, the infrastructure that LISP provides is superior
for doing AI, because it:
1) provides a larger number of existing resources: libraries,
programs, etc., for:
a) knowledge representation
b) logic programming
c) expert systems
d) genetic programming
e) game playing (plans, strategies, intentions, actions, etc.)
f) parsing natural languages
all of which are important for what we want to implement;
2) requires the fewest conversions
(serialization/deserialization) when the app servers
themselves are written in LISP;
3) provides the greatest amount of computational power;
4) is more intuitive, since the parsing language can be
the same as the exchange language.
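Points 1 and 4 can be sketched with a minimal pattern-matcher over nested lists, in the style of classic LISP matchers; the `?var` convention and the `match` function are assumptions of mine, and repeated-variable consistency checks are omitted to keep it short:

```python
# Because the exchange language is the same nested-list form the program
# manipulates, a pattern received over the wire can be matched directly
# against incoming messages. ?-prefixed strings are pattern variables.
def match(pattern, data, env=None):
    """Match a pattern with ?variables against nested-list data.

    Returns a dict of variable bindings on success, None on failure.
    """
    env = dict(env or {})
    if isinstance(pattern, str) and pattern.startswith("?"):
        env[pattern] = data          # bind the variable (no repeat check)
        return env
    if isinstance(pattern, list) and isinstance(data, list) \
            and len(pattern) == len(data):
        for p, d in zip(pattern, data):
            env = match(p, d, env)
            if env is None:
                return None
        return env
    return env if pattern == data else None

rule = ["REQUEST-QUOTE", ["ADJECTIVE", ["?article", "?grade"], "?color"]]
msg  = ["REQUEST-QUOTE", ["ADJECTIVE", ["PENCILS", "#2"], "YELLOW"]]
print(match(rule, msg))
# {'?article': 'PENCILS', '?grade': '#2', '?color': 'YELLOW'}
```

Since `rule` is itself just a nested list, it could have been sent over the wire in the same syntax as the message it matches, which is the whole point of item 4.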
To me, it doesn't make any sense to reinvent the AI wheel with
XMLish technologies... this may in fact contribute
to a second commercial failure of AI.