Re: Operator-overloaded JSON API: best way to handle arrays?
I think implicitly creating arrays is quite tricky. For example, if you look at
jsondoc["foo"]["bar"] = "harhar";
the type of the key (String or Number) determines whether an object or an array will be created...
I think it is better to force explicit creation of arrays and objects; that will not lead to misconceptions.
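To make the ambiguity concrete, here is a minimal sketch of the explicit-creation approach. This is not the actual JsonMe++ API; the `Node` type, its public `members`/`elements` fields, and `append()` are invented for illustration. String keys implicitly turn a null node into an object, but arrays can only be grown by calling `append()` explicitly:

```cpp
#include <cassert>
#include <map>
#include <string>
#include <vector>

// Hypothetical node type, illustrating explicit array creation.
// Not the real JsonMe++ API.
struct Node {
    enum Kind { Null, Object, Array, String } kind = Null;
    std::map<std::string, Node> members;  // used when kind == Object
    std::vector<Node> elements;           // used when kind == Array
    std::string value;                    // used when kind == String

    // String key: implicitly promotes a null node to an object.
    Node& operator[](const std::string& key) {
        assert(kind == Null || kind == Object);
        kind = Object;
        return members[key];
    }

    // Arrays are created and grown explicitly; there is no numeric
    // operator[], so no sparse arrays can appear by accident.
    Node& append() {
        assert(kind == Null || kind == Array);
        kind = Array;
        elements.emplace_back();
        return elements.back();
    }

    // Assigning a string turns the node into a string value.
    Node& operator=(const char* s) {
        kind = String;
        value = s;
        return *this;
    }
};
```

With this shape, `doc["foo"].append()["bar"] = "harhar";` builds an object containing a one-element array, and there is no way to accidentally create a sparse array, because no numeric subscript exists to do so.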
--- In firstname.lastname@example.org, Rob Meijer <pibara@...> wrote:
> A while ago I wrote the JsonMe++ wrapper library. Currently a wrapper
> for the parsing part of the glib JSON C library.
> JsonMe++ provides an API that heavily uses operator overloading, cast
> operators and value semantic proxies to provide
> a bit of a scripting language like API for parsing JSON.
> Now I would like to continue working on JsonMe++ by expanding the API
> with the possibility of also creating JSON
> rather than just parsing it. Handling objects seems quite simple and
> natural to do simply by adding some assignment
> operators to the API.
> Handling arrays however has got me a bit confused. Do I stick to an
> API with as few 'named' node-level methods as possible?
> It does seem to have a lot of merit to keep the node API free of named
> methods, but this will require me to handle
> sparse arrays, which may or may not be a good idea.
> Basically I could make the API work like:
> jsondoc["foo"]["bar"] = "harhar";
> And, if jsondoc were still empty, have it create a new JSON object with a
> foo member that is a sparse array with an empty index 0 element
> and an object index 1 element that has a "bar" member with value "harhar".
> Alternatively I could make the API so that the above would need to be
> expressed as:
> jsondoc["foo"].append()["bar"] = "harhar"
> I don't know if one of these would be better, or if there would be a
> third alternative that would be more suitable.
> I would be very interested in learning what others think.