
Re: Iterating through a huge JSON variable - freezes FF

  • carl.harroch
    Message 1 of 10 , Feb 7 8:08 AM
      Hello Matthew,

      My JavaScript looks something like the following (using prototype.js):

      ---
      var jsonTest = Class.create();
      jsonTest.prototype = {

          allData: [{attributes: {name: "", id: ""}}],
          myData:  [{attributes: {name: "", id: ""}}],

          initialize: function() {
              // loadingDiv and resultContainer are assumed to be set up
              // elsewhere on the page.
              this.dataTemplate = new Template(
                  '<span class="#{dataClass}" style="color: #{dataColor}; ' +
                  'padding: 20px;" > #{dataName} </span>');

              // Note: 'false' is a string here, and a non-empty string is
              // truthy, so the request still runs asynchronously
              // (Prototype expects a boolean).
              var initData = new Ajax.Request('/data.json', {
                  method: 'get',
                  asynchronous: 'false',
                  onLoading: function () { this.loadingDiv.show(); }.bind(this),
                  onFailure: function () { alert('Something went wrong...'); },
                  onSuccess: this.initSearch.bind(this),
                  onComplete: this.showData.bind(this)
              });
          },

          initSearch: function (originalRequest) {
              this.allData = eval('(' + originalRequest.responseText + ')');
              this.myData  = eval('(' + originalRequest.responseText + ')');
          },

          showData: function () {
              var arr    = $w('class1 class2 class3 class4 class5');
              var colore = $w('#99FF00 #333333 #FF3298');
              this.myData.each(function (item, index) {
                  this.resultContainer.innerHTML +=
                      this.dataTemplate.evaluate({
                          dataClass: arr[Math.floor(Math.random() * arr.length)],
                          dataColor: colore[Math.floor(Math.random() * colore.length)],
                          dataName:  item.attributes.name
                      });
              }.bind(this));
          }
      };
      ---

      And my data.json:

      ---
      [{attributes: {name: "ca", id: "2"}}, {attributes: {name:
      "111ffrrferf", id: "3"}},.... * 2000 records
      ---

      I am pretty sure I know where the problem comes from. It is due to the
      repeated writes to this.resultContainer.innerHTML... I tried briefly
      saving the markup to a temporary string before swapping the innerHTML,
      and I could see a real change in the time it takes to display the JSON.
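
      For reference, a minimal sketch of that "temporary string" variant of
      showData, assuming the same dataTemplate and resultContainer as above:

      ---
      // Sketch only: build the markup into an array and write innerHTML once,
      // instead of re-parsing the container's HTML on every record.
      showData: function () {
          var classes = $w('class1 class2 class3 class4 class5');
          var colors  = $w('#99FF00 #333333 #FF3298');
          var pieces  = [];

          this.myData.each(function (item) {
              pieces.push(this.dataTemplate.evaluate({
                  dataClass: classes[Math.floor(Math.random() * classes.length)],
                  dataColor: colors[Math.floor(Math.random() * colors.length)],
                  dataName:  item.attributes.name
              }));
          }.bind(this));

          this.resultContainer.innerHTML = pieces.join('');   // single DOM write
      }
      ---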

      Do you believe my code is a good approach to generating HTML from
      large JSON?

      I still believe it is not a clean solution. I am very new to
      JavaScript. Multi-threading will be needed in the future, especially
      with AJAX and controller logic moving more and more to the
      client side. How does the AJAX request model handle this?

      Anyway, will continue to work on this and keep you updated.

      Thanks,
      Carl

      --- In json@yahoogroups.com, "Matthew Morley" <WickedLogic@...> wrote:
      >
      > The warning is meant to protect users from scripts that might be or are
      > abusing the system resources; the solution is not to get rid of the
      > message but to examine your code. I've got a few objects around 1k
      > nodes which function fine, but those are all unique fields, not records.
      >
      > Could you post some code where you are doing the conversion from JSON
      > into the Javascript objects, as well as your loop?
      > Where in the code process do you get the message?
      > Depending on your data, perhaps you are doing too much in general and
      > should break the task down into smaller steps?
      >
      > --
      > Matthew P. C. Morley
      >
    • Michal Migurski
      Message 2 of 10 , Feb 7 8:49 AM
        > And my data.json:
        >
        > ---
        > [{attributes: {name: "ca", id: "2"}}, {attributes: {name:
        > "111ffrrferf", id: "3"}},.... * 2000 records

        Nitpick: this isn't valid JSON, get those hash keys quoted.
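
        For comparison, the same records with the keys quoted, which is what a
        strict JSON parser expects:

        [{"attributes": {"name": "ca", "id": "2"}},
         {"attributes": {"name": "111ffrrferf", "id": "3"}}, ...]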


        > I am pretty sure I know where the problem comes from. It is due to the
        > repeated writes to this.resultContainer.innerHTML... I tried briefly
        > saving the markup to a temporary string before swapping the innerHTML,
        > and I could see a real change in the time it takes to display the JSON.
        >
        > Do you believe my code is a good approach to generating HTML from
        > large JSON?
        >
        > I still believe it is not a clean solution. I am very new to
        > JavaScript. Multi-threading will be needed in the future, especially
        > with AJAX and controller logic moving more and more to the
        > client side. How does the AJAX request model handle this?

        Iterating over 2000 of anything in Javascript is going to cause
        problems for most browsers - visit a heavily-commented page on Digg
        for another example of greedy, blocking loops in a production site. =)

        Try to break this problem down into smaller chunks, as Matthew
        suggested. For example, divide the 2000-element request into 100-
        element requests, and chain them together so that a new request is
        fired only after the previous response has been received and handled.
        It will take longer overall and require more trips to and from the
        server, but giving the browser some breathing room to address user
        input or UI tasks will make it feel snappier. This kind of
        asynchronous (the "A" in "Ajax"), chunked-out approach gets you most
        of the benefits of multithreading without the headaches.
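
        A minimal sketch of that chained, chunked approach with Prototype's
        Ajax.Request; the offset/limit query parameters and the renderRecords()
        helper are assumptions about your server and display code, not part of
        the code posted above:

        ---
        // Sketch only: ask for 100 records at a time, and request the next
        // chunk only after the previous one has been received and rendered.
        function fetchChunk(offset, limit, total) {
            if (offset >= total) return;               // all chunks handled

            new Ajax.Request('/data.json', {
                method: 'get',
                parameters: 'offset=' + offset + '&limit=' + limit,
                onSuccess: function (response) {
                    var records = eval('(' + response.responseText + ')');
                    renderRecords(records);            // your display code
                    fetchChunk(offset + limit, limit, total);   // chain next
                },
                onFailure: function () { alert('Something went wrong...'); }
            });
        }

        fetchChunk(0, 100, 2000);
        ---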

        -mike.

        ----------------------------------------------------------------
        michal migurski- contact info and pgp key:
        sf/ca http://mike.teczno.com/contact.html
      • Mark Ireland
        Message 3 of 10 , Feb 7 4:41 PM
          I wonder if something like this:

          {"RECORDCOUNT":10,"UNIQUEIDS":["js63013","js63009","js63028","js63185","js63039","js63044","js63228","js63246","js63077","js63227"],"COLUMNNAMES":["EM","EY","ISSILO","ISWEATHER","SM","STATIONID","STATIONNAME","SY","X","Y"],"js63013":{"CURRENTROW":1,"EM":8,"EY":2006,"ISSILO":0,"ISWEATHER":1,"SM":4,"STATIONID":63013,"STATIONNAME":"BERAMBING","SY":1943,"X":166,"Y":97},
          . . ....
          and so on
          . . . ...

          "js63227":{"CURRENTROW":10,"EM":10,"EY":2006,"ISSILO":1,"ISWEATHER":1,"SM":3,"STATIONID":63227,"STATIONNAME":"WENTWORTH
          FALLS COUNTRY CLUB","SY":1967,"X":124,"Y":187}}

          would be better like this:

          {"RECORDCOUNT":10,"UNIQUEIDS":["js63013","js63009","js63028","js63185","js63039","js63044","js63228","js63246","js63077","js63227"],"COLUMNNAMES":["ENDMONTH","ENDYEAR","ISSILO","ISWEATHER","STARTMONTH","STATIONID","STATIONNAME","STARTYEAR","XCOORD","YCOORD"],"js63013":{"CURRENTROW":1,"k1":8,"k2":2006,"k3":0,"k4":1,"k5":4,"k6":63013,"k7":"BERAMBING","k8":1943,"k9":166,"k10":97},
          . . ....
          and so on
          . . . ...

          "js63227":{"CURRENTROW":10,"k1":10,"k2":2006,"k3":1,"k4":1,"k5":3,"k6":63227,"k7":"WENTWORTH
          FALLS COUNTRY CLUB","k8":1967,"k9":124,"k10":187}}

          with some code to look up the column names.
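
          A minimal sketch of that lookup, assuming the COLUMNNAMES / k1..k10
          layout above (the expand() helper name is just for illustration):

          ---
          // Sketch only: rebuild a full record from the short k1..k10 keys,
          // using COLUMNNAMES as the lookup table.
          function expand(data, uniqueId) {
              var record = {};
              for (var i = 0; i < data.COLUMNNAMES.length; i++) {
                  record[data.COLUMNNAMES[i]] = data[uniqueId]['k' + (i + 1)];
              }
              return record;
          }

          // expand(data, "js63013").STATIONNAME  ->  "BERAMBING"
          ---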

        • mertsakarya@hotmail.com
          Message 4 of 10 , Feb 7 10:17 PM
            We once had a long-running script on the client side which took about one minute to execute/parse/display the data (a two-dimensional array with thousands of rows).
            First of all, this shouldn't happen, but if you can't avoid it, do it asynchronously.

            What we did was create an "interval" and parse the rows a hundred at a time (fewer or more for your situation).
            This increased the responsiveness of the page.

            The idea is: download the data asynchronously, and if the data is huge, parse/display it asynchronously as well.
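
            A minimal sketch of that interval idea; rows and renderRow() are
            placeholders for your own data and display code:

            ---
            // Sketch only: handle the rows a hundred at a time so the browser
            // can breathe between batches.
            var index = 0;
            var BATCH = 100;

            var timer = setInterval(function () {
                var end = Math.min(index + BATCH, rows.length);
                for (; index < end; index++) {
                    renderRow(rows[index]);
                }
                if (index >= rows.length) {
                    clearInterval(timer);   // all rows handled
                }
            }, 50);                         // short pause keeps the UI responsive
            ---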

            Mert


            ----- Original Message -----
            From: carl.harroch
            To: json@yahoogroups.com
            Sent: Wednesday, February 07, 2007 2:07 PM
            Subject: [json] Iterating through a huge JSON variable - freezes FF


            Hello,

            I am trying to iterate through a big JSON variable (about 1500 nodes).
            It works but FF pops up with the message saying the script is not
            responding (A script on this page may be busy... do you want to stop
            the script, debug, continue). If I select continue, it works fine. It
            is just that the iteration takes a bit of time to go through all the
            nodes. Is there a way to avoid the above? I searched the net a bit
            and it is not very clear how to do so, as JS is not multithreaded
            from what I could gather. However, I am sure there could be some
            asynchronous iteration releasing the CPU from time to time to ensure
            nothing else needs to be done on the page.

            Any ideas?

            /Carl





          • mertsakarya@hotmail.com
            Message 5 of 10 , Feb 7 10:21 PM
              One more thought:
              why not use array notation, i.e. [ [<name, string>, <id, string>], ... ],
              instead of [{attributes: {name: "", id: ""}}, ...] ?
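
              A minimal sketch of the difference for the data.json above; the
              NAME/ID index constants are just for readability:

              ---
              // Object-per-record form (what data.json uses now):
              //   [{"attributes": {"name": "ca", "id": "2"}}, ...]
              // Array-per-record form (smaller payload, less to parse):
              var records = [["ca", "2"], ["111ffrrferf", "3"]];

              var NAME = 0, ID = 1;
              alert(records[0][NAME] + ' / ' + records[0][ID]);   // "ca / 2"
              ---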

              Mert
              ----- Original Message -----
              From: carl.harroch
              To: json@yahoogroups.com
              Sent: Wednesday, February 07, 2007 2:07 PM
              Subject: [json] Iterating through a huge JSON variable - freezes FF


              Hello,

              I am trying to iterate through a big JSON variable (about 1500 nodes).
              It works but FF pops up with the message saying the script is not
              responding (A script on this page may be busy... do you want to stop
              the script, debug, continue). If I select continue, it works fine. It
              is just that the iteration takes a bit of time to go through all the
              nodes. Is there a way to avoid the above? I searched the net a bit
              and it is not very clear how to do so, as JS is not multithreaded
              from what I could gather. However, I am sure there could be some
              asynchronous iteration releasing the CPU from time to time to ensure
              nothing else needs to be done on the page.

              Any ideas?

              /Carl





            • carl.harroch
              Message 6 of 10 , Feb 8 4:41 AM
                > Nitpick: this isn't valid JSON, get those hash keys quoted.
                It was generated with the JSON add-in for RoR. Will look into that later on.

                I am trying to avoid the trip back to the server and download the list just
                once. Could it be better to do the following (a sketch follows the list):

                1. Download the entire JSON
                2. Iterate through the first 100 records, then break
                3. Go back to step 2 until the entire JSON has been iterated
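
                A minimal sketch of steps 2-3, assuming the JSON has already been
                downloaded and parsed into myData; processChunk() is a
                hypothetical name:

                ---
                // Sketch only: walk the already-downloaded array 100 records at
                // a time, yielding back to the browser between chunks with
                // setTimeout so the "unresponsive script" warning never fires.
                function processChunk(data, start) {
                    var end = Math.min(start + 100, data.length);
                    for (var i = start; i < end; i++) {
                        // render data[i] here, e.g. with the Template from the
                        // first post
                    }
                    if (end < data.length) {
                        setTimeout(function () { processChunk(data, end); }, 25);
                    }
                }

                processChunk(myData, 0);   // myData is the parsed record array
                ---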


                --- In json@yahoogroups.com, Michal Migurski <mike-jsonphp@...> wrote:
                >
                > > And my data.json:
                > >
                > > ---
                > > [{attributes: {name: "ca", id: "2"}}, {attributes: {name:
                > > "111ffrrferf", id: "3"}},.... * 2000 records
                >
                > Nitpick: this isn't valid JSON, get those hash keys quoted.
                >
                >
                > > I am pretty sure I know where the problem comes from. It is due to the
                > > repeated writes to this.resultContainer.innerHTML... I tried briefly
                > > saving the markup to a temporary string before swapping the innerHTML,
                > > and I could see a real change in the time it takes to display the JSON.
                > >
                > > Do you believe my code is a good approach to generating HTML from
                > > large JSON?
                > >
                > > I still believe it is not a clean solution. I am very new to
                > > JavaScript. Multi-threading will be needed in the future, especially
                > > with AJAX and controller logic moving more and more to the
                > > client side. How does the AJAX request model handle this?
                >
                > Iterating over 2000 of anything in Javascript is going to cause
                > problems for most browsers - visit a heavily-commented page on Digg
                > for another example of greedy, blocking loops in a production site. =)
                >
                > Try to break this problem down into smaller chunks, as Matthew
                > suggested. For example, divide the 2000-element request into 100-
                > element requests, and chain them together so that a new request is
                > fired only after the previous response has been received and handled.
                > It will take longer overall and require more trips to and from the
                > server, but giving the browser some breathing room to address user
                > input or UI tasks will make it feel snappier. This kind of
                > asynchronous (the "A" in "Ajax"), chunked-out approach gets you most
                > of the benefits of multithreading without the headaches.
                >
                > -mike.
                >
                > ----------------------------------------------------------------
                > michal migurski- contact info and pgp key:
                > sf/ca http://mike.teczno.com/contact.html
                >
              • Michal Migurski
                Message 7 of 10 , Feb 8 9:15 AM
                  > > Nitpick: this isn't valid JSON, get those hash keys quoted.
                  > It was generated with the JSON add-in for RoR. Will look into
                  > that later on.

                  Yuck, file a bug report. There's no excuse for generating javascript
                  and calling it JSON.


                  > I am trying to avoid the trip back to the server and download
                  > the list just once. Could it be better to do the following:
                  >
                  > 1. Download the entire JSON
                  > 2. Iterate through the first 100 records, then break
                  > 3. Go back to step 2 until the entire JSON has been iterated

                  Maybe, though it depends on how long it takes to parse the entire
                  message - there's an upper limit of reasonability to this, too. You
                  don't want to download a 10,000 item array and parse that in one go,
                  or discover the array that presents no problems on your browser but
                  brings an older or slower computer to its knees.

                  I wrote a quick blog post specifically about chunking this kind of
                  task out, with some tiny example code:
                  http://mike.teczno.com/notes/polite-loops.html

                  -mike.

                  ----------------------------------------------------------------
                  michal migurski- contact info and pgp key:
                  sf/ca http://mike.teczno.com/contact.html
                • carl.harroch
                  Message 8 of 10 , Feb 15 4:02 AM
                    Thanks Mike,

                    I believe I will go for your last solution. Currently I am
                    working on several parts of my site, which does not leave a
                    lot of time for the iterating part. As soon as I manage to
                    test some different setups, I will post back a summary of my
                    findings.

                    I'll keep you updated,
                    Carl

                    --- In json@yahoogroups.com, Michal Migurski <mike-jsonphp@...> wrote:
                    >
                    > > > Nitpick: this isn't valid JSON, get those hash keys quoted.
                    > > It was generated with the JSON add-in for RoR. Will look
                    > > into that later on.
                    >
                    > Yuck, file a bug report. There's no excuse for generating javascript
                    > and calling it JSON.
                    >
                    >
                    > > I am trying to avoid the trip back to the server and
                    > > download the list just once. Could it be better to do the
                    > > following:
                    > >
                    > > 1. Download the entire JSON
                    > > 2. Iterate through the first 100 records, then break
                    > > 3. Go back to step 2 until the entire JSON has been iterated
                    >
                    > Maybe, though it depends on how long it takes to parse the entire
                    > message - there's an upper limit of reasonability to this, too. You
                    > don't want to download a 10,000 item array and parse that in one go,
                    > or discover the array that presents no problems on your browser but
                    > brings an older or slower computer to its knees.
                    >
                    > I wrote a quick blog post specifically about chunking this kind of
                    > task out, with some tiny example code:
                    > http://mike.teczno.com/notes/polite-loops.html
                    >
                    > -mike.
                    >
                    > ----------------------------------------------------------------
                    > michal migurski- contact info and pgp key:
                    > sf/ca http://mike.teczno.com/contact.html
                    >