Some comments from the peanut gallery:
> [1-byte ASCII marker indicating type][4-byte int32 size][binary data]
1. I presume the only reason for making this a signed number is just to
be consistent with the processing of binary representations of numbers?
2. I presume the size is strictly the number of bytes/octets, not the
number of UTF-8 characters in a string, for example. (The sketch below
assumes as much.)
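
For concreteness, here is a little Python sketch of reading one such
frame. The little-endian byte order and the 's' string marker are my
own assumptions for illustration; the proposal pins down neither:

    import struct

    def read_value(buf, offset=0):
        # One frame: [1-byte ASCII marker][int32 size][payload].
        # Byte order (little-endian here) is a guess on my part.
        marker = chr(buf[offset])
        (size,) = struct.unpack_from("<i", buf, offset + 1)  # signed int32
        if size < 0:
            raise ValueError("negative size")  # all signedness buys is this check
        start = offset + 5
        return marker, buf[start:start + size], start + size

    # A hypothetical 's' (string) frame: 5 characters but 6 octets of
    # UTF-8, so the size field had better count octets.
    payload = "héllo".encode("utf-8")
    frame = b"s" + struct.pack("<i", len(payload)) + payload
    print(read_value(frame))  # ('s', b'h\xc3\xa9llo', 11)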
> The only difference from JSON being that "Number" is broken out into:
> int32, int64 and double types for the purposes of making parsing of
> the values as efficient as possible in Java, C, C#, Python, Erlang,
> PHP and any other language that has multiple concepts of the different
> types of numbers it can represent.
There are numbers that JSON can represent that cannot be represented
(only roughly approximated) by your selected numerical representations.
Instead of two's complement and IEEE 754 (which I think you are
implying), perhaps consider BCD with an explicit decimal position. This
would support any number that JSON can express with its Number type,
and we can avoid endian bike-shedding while we're at it.
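
To make the approximation point concrete, here is a quick Python
illustration; the decimal module stands in for a BCD-with-scale
encoding (my choice of stand-in, not something anyone has proposed here):

    from decimal import Decimal

    # JSON can write exactly 0.1; the nearest IEEE 754 double is only close:
    print(Decimal(0.1))    # 0.1000000000000000055511151231257827021181583404541015625
    print(Decimal("0.1"))  # 0.1 -- what a digits-plus-position encoding keeps

    # BCD with an explicit decimal position is the same idea in binary:
    # "123.45" -> digits 1 2 3 4 5, decimal position 2; no rounding, and
    # nibble-sized digits leave nothing endian to argue about.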