
Re: Universal language for communicating with space aliens

  Logan Kearsley
  Jul 26, 2014
      On 26 July 2014 04:38, Patrik Austin <patrik.austin@...> wrote:
      > I think maybe now I see what you mean and you could be right. There may still be some misunderstandings on my side, but correct me whenever I'm wrong and I'll get back to it a little later when I have the time to read everything more thoroughly.
      >
      > So, essentially you've made a grammar that doesn't have verbs. And when there aren't any verbs, it's pointless to talk about subject, object and adverbial, as they are constituents that are supposed to refer to the verb.

      Pretty much. It goes a little farther than that, though; there are
      several other languages that don't have things that look like "verbs"
      in the usual sense, but still have some kind of clause-controlling
      element that assigns roles asymmetrically, or other syntactic
      requirements that allow you to consistently distinguish different
      kinds of argument positions. E.g., Kelen, which I mentioned before,
      and which has "relationals" which basically fill in for the purely
      *syntactic* functions of verbs while being semantically bleached and
      forming a small closed class. This is sufficiently odd that it's
      entirely reasonable to conclude that Kelen really doesn't have verbs,
      but it's still possible to distinguish subjects, core objects, oblique
      objects, and clause-modifying "adverbials" (though that nomenclature
      would be odd, given the circumstances).

      Weird-syntax-lang, on the other hand, puts every sub-clausal
      constituent on the same level; it's just a very thin layer of
      notational veneer on top of a particular formulation of predicate
      calculus semantics, where the "parts-of-speech" are one-place
      predicates (content words), two-place predicates (postpositions),
      logical connectives (conjunctions & the not operator), parentheses
      (which was just for convenience; in a real language I'd find a way to
      remove them, probably allowing for ambiguity), and (so far only one)
      modal propositional operator (which *could* be reformulated as a
      two-place predicate over reified relations, but that would take more
      design effort). Not only are there no verbs, but neither is there any
      *other* mechanism to reliably distinguish one argument from another.
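
      For concreteness, here's a rough Python sketch of how I picture that
      inventory as a data structure. Every name in it is mine, made up for
      illustration, rather than an actual morpheme of the language:

      from dataclasses import dataclass

      @dataclass
      class Content:          # one-place predicate: a "content word"
          name: str           # e.g. "tree", "fall", "forest"
          arg: str            # the variable it is predicated of

      @dataclass
      class Postposition:     # two-place predicate relating two variables
          name: str           # e.g. "in"
          arg1: str
          arg2: str

      @dataclass
      class And:              # logical connective (conjunction)
          left: object
          right: object

      @dataclass
      class Not:              # the not operator
          operand: object

      @dataclass
      class Modal:            # the single modal propositional operator
          name: str           # e.g. "future"
          operand: object

      # One possible reading of "tree falls in forest", on my guess at
      # how the variables get shared:
      clause = And(Content("tree", "x"),
                   And(Content("fall", "x"),
                       And(Content("forest", "y"),
                           Postposition("in", "x", "y"))))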

      > That was really clever and I must admit I hadn't thought of it. It looks like a solution. The way I understand the sentence is as follows:
      >
      > IF in-forest-tree-fall THEN future-sound-causation, yes?

      Eh, well, I'm not sure what that notation is supposed to represent, so maybe?

      > That's actually not weird at all. It's very logical and computer-friendly. Russian famously omits the copula: eto pravda (это правда) – "this true" for this is true.
      >
      > I could say I want to add the copula to make my analysis neat, but then you can argue that the copula is redundant. True, the computer wouldn't see any use for it.

      Just as an aside, I have noticed "это" apparently being used in a very
      copula-like way in colloquial speech (e.g. "Моя машина - это Вольво."
      -> "My car is a Volvo"; yeah, it's *actually* a cleft, and you can
      tell by the intonation, but it's just *begging* for reanalysis). Kinda
      makes me wonder if "Future Russian" mightn't re-evolve a present-tense
      copula by that mechanism, along with reanalysis of predicate
      adjectives as full-on verbs. That in turn could lead to the
      development of distinct "active verb" vs. "stative predicate" parts of
      speech, or to an analogical leveling that reduces normal verbal
      inflections and turns forms of быть into tense markers (like English
      "will" and "did"). That last option is probably less likely just
      because of how much more common normal verb inflections are vs.
      predicate adjective or predicate adverb constructions, but hey, this
      is conlanging, I can do what I want. :)

      > On the other hand I could point out that the conjunctions can actually be analysed as verbs: " in-forest-tree-fall ifs (is iffing), future-sound-causation thens (is thenning)" so I can go back to using subject, verb, object and adverbial in my analysis. But then you can argue I have no evidence my analysis is better than yours. So it follows that I won't be able to refute your argument.

      A much cleaner correspondence comes from just using "implies":
      "in-forest-tree-falls _implies_ future-sound-causation".
      I might argue that that's odd because then "verbs" in
      weird-syntax-lang would only ever take full clauses as arguments. On
      the other hand, that's exactly how the English verb "implies" works
      (it takes clauses or anaphors for clauses- propositions, anyway). In
      fact, most logical connectives can be rendered as verbs as well as
      conjunctions in English, with a little creativity: "accompany" for
      "and", "exclude" for "not" (or "and not"), etc. After all that,
      though, I'd have to point out that the connectives act much *more*
      like conjunctions, and it's simplest to just assume that that is
      indeed what they are.
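
      (For the record, the truth-functional content of that "implies" is
      just material implication; spelled out as a one-liner in Python,
      with placeholder variable names standing in for the two clauses:

      def implies(p: bool, q: bool) -> bool:
          # False only when the antecedent holds and the consequent doesn't.
          return (not p) or q

      tree_falls_in_forest = True   # placeholder values, just for show
      sound_will_occur = True
      print(implies(tree_falls_in_forest, sound_will_occur))  # True
      )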

      > Am I anywhere near?

      Yup!

      At this point, I feel it might be useful to bring up the concept of
      "vector spaces" and "basis sets", where a basis set is a set of
      vectors that can be combined to create any point in the space. The
      simplest example of a vector space is just regular 3D space, which
      requires 3 basis vectors- one per dimension- to describe. However, the
      basis set is not unique: you can use {x, y, z} (cartesian), or {r,
      phi, theta} (spherical), or {r, y, theta} (cylindrical), or any number
      of other possible transformed sets.
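      To make the non-uniqueness concrete, here's the standard
      cartesian-to-spherical conversion (in Python, using the physics
      convention where theta is the polar angle): same point, different
      coordinates.

      import math

      x, y, z = 1.0, 1.0, 1.0            # the point in {x, y, z}
      r = math.sqrt(x*x + y*y + z*z)     # the same point in {r, phi, theta}
      theta = math.acos(z / r)           # polar angle, measured from +z
      phi = math.atan2(y, x)             # azimuth, in the x-y plane
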
      But, we can have more abstract vector spaces. Like, say, the space of
      all possible computations. There are tons of equivalent basis sets for
      that- sets of operations that form Turing-complete systems, where the
      members of the set can be combined by mathematical operations to reach
      every point in the space (describe every possible computation). Oddly,
      it turns out the minimal basis set for computation only has one
      member, but there are still lots of options for it; e.g., lambda
      abstraction, the NAND operator, etc.
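
      A quick Python demonstration of the NAND case, since it's the
      easiest one to see; every other Boolean connective can be rebuilt
      from that single operator:

      def nand(a: bool, b: bool) -> bool:
          return not (a and b)

      def not_(a):        return nand(a, a)
      def and_(a, b):     return not_(nand(a, b))
      def or_(a, b):      return nand(not_(a), not_(b))
      def implies_(a, b): return nand(a, not_(b))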

      Now, what does that have to do with conlanging? Well, we can describe
      the set of all possible syntactic structures as a vector space. (We
      can also describe semantics as an infinite-dimensional vector space,
      but infinite-dimensional things are tricky to deal with.) So, it's
      gotta have basis sets- collections of basic syntactic structures that
      can be combined to form all the possible ones. Not all languages
      display every possible syntactic structure in every theory of syntax,
      so some languages are actually using an incomplete basis set defining
      a sub-space of possible syntax. And I'd bet that few if any natural
      languages actually encode a minimal basis set.

      But let's say you found a minimal basis set of morphosyntactic
      operations that could describe all of human language. Better, let's
      say you found a minimal basis set that could describe all possible
      logically lossless syntaxes. I would bet that you could not prove any
      such set to be unique. Actually, I've got an existence proof that it
      won't be unique- already, we have predicate logic and set semantics as
      mathematically equivalent but definitely different systems for
      describing the compositional operations that syntax encodes. I'm not
      willing to bet on this next one, but I'd at least be surprised if it
      could be proven that there was a finite, enumerable set of possible
      minimal basis sets. As a result, I would expect to find that genuinely
      alien languages use organizational features that we won't have thought
      of as linguistically possible before encountering them. Now, I could
      be wrong about that. Maybe there are practical constraints from
      computational complexity, ease of modelling the real world,
      error-correction mechanisms, and so forth that will end up making
      alien languages a lot more human-like than they could be if
      constrained only by pure information theory and logic. But without
      some truly breakthrough advances in formal semantics and theoretical
      syntax, which are likely not forthcoming since Real Linguists tend to
      be concerned just with studying human languages, I don't think we can
      safely *assume* anything at all about aliens' linguistic abilities
      apart from basic information theory and logic. The space of possible
      languages is just too huge.

      -l.