
Re: [tekkotsu_dev] Recording scenes in MIRAGE

  • Ignacio Herrero Reder
    Message 1 of 7, Jul 17 3:14 AM
      I finally managed to print a frame number both in the MIRAGE window and in the on-board camera view, so it is easier to relate the two. There is a small mismatch of about two frames, because I simply reset a counter when the client/robot connects and increment it inside the "renderImage" function; the cameras probably need two frames to set up, so the counter starts counting in MIRAGE before the cameras are ready. In any case, I can cope with it.
      I could probably also add the frame number to the client ParseTree in order to synchronize the frame information, but I'm not sure it's worth the effort...
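      For reference, the counter logic is roughly the following (a simplified sketch; only "renderImage" is an actual MIRAGE function, the other names are my own):

          // Sketch of the frame counter (names are mine; only renderImage()
          // is an actual MIRAGE hook point).
          unsigned int frameCounter = 0;

          void onClientConnect() {
              frameCounter = 0;                  // reset when the client/robot connects
          }

          void renderImage(/* ... */) {
              ++frameCounter;                    // one tick per rendered frame
              drawOverlayNumber(frameCounter);   // hypothetical helper: MIRAGE window
              stampCameraImage(frameCounter);    // hypothetical helper: camera view
              // The cameras seem to need ~2 frames to set up, which explains
              // the small offset between the two displays.
          }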

      Regarding recording scenes, everybody on the internet recommends just recording with an ordinary desktop recorder application. To change the point of view inside the scene, you would have to set up and record several cameras at the same time, which is very time consuming, so the only approach I can think of is to record the actions once and then repeat the same actions with the camera in a different position...

      I am still working on the "vector direction arrow"... It would also be interesting to have a top view of the map, to see and record the trajectory of the robot...


      Ignacio Herrero Reder            / Tl. +34-95.213.71.60
      Dpto. Tecnologia Electronica     / Fax: +34-95.213.14.47 
      E.T.S. Ing. Telecomunicacion     / nhr@... 
      Universidad de Malaga            / http://www.dte.uma.es
      Campus Universitario de Teatinos 
      29010 Malaga, Spain  
      On 24/06/2013 11:31, Ignacio Herrero Reder wrote:

      Hello. I wonder if there is any way to record scenes of simulations in MIRAGE, so I could replay them while moving the camera to find the best point of view.
      Is there any way to link frames in the MIRAGE window with frames in the simulated AIBO camera? That is, I can record on-board AIBO images with a frame number, and I would like to relate those frame numbers to the view in the MIRAGE window, perhaps by passing frame info from the simulator to MIRAGE and printing it in a corner?

      I would also like to add some other (I think) useful features to MIRAGE, such as a vector arrow pointing in the planned movement direction; I think this would again involve passing information from the simulator to MIRAGE. Does anybody have any information about the MIRAGE implementation, or about how I could try to do this?

      THANKS a lot in advance!
      Ignacio

      Ignacio Herrero Reder / Tl. +34-95.213.71.60
      Dpto. Tecnologia Electronica / Fax: +34-95.213.14.47
      E.T.S. Ing. Telecomunicacion / nhr@...
      Universidad de Malaga / http://www.dte.uma.es
      Campus Universitario de Teatinos
      29010 Malaga, Spain


    • Ignacio Herrero Reder
      Message 2 of 7, Jul 17 5:49 AM
        Hello again. I've found another possible bug in MIRAGE. It is related to the collision model: the "H" key is supposed to toggle between GRAPHICS_MODEL_MASK and COLLISION_MODEL_MASK, but COLLISION_MODEL_MASK is always ON for robots (clients).
        Is there any way to fix this?
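        For reference, the behavior I expected from the "H" key is roughly this (a sketch of the idea only, not MIRAGE's actual key-handler code; applyVisibilityMask is a hypothetical helper):

            // Sketch of the expected toggle (illustrative only; the real
            // MIRAGE key handler and mask plumbing may differ).
            uint32_t visibilityMask = GRAPHICS_MODEL_MASK;

            void onKeyH() {
                visibilityMask = (visibilityMask == GRAPHICS_MODEL_MASK)
                               ? COLLISION_MODEL_MASK
                               : GRAPHICS_MODEL_MASK;
                applyVisibilityMask(visibilityMask);  // hypothetical helper
            }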
        Thank you

        (working with Tekkotsu 5.03cvs (STABLE))


        Ignacio Herrero Reder / Tl. +34-95.213.71.60
        Dpto. Tecnologia Electronica / Fax: +34-95.213.14.47
        E.T.S. Ing. Telecomunicacion / nhr@...
        Universidad de Malaga / http://www.dte.uma.es
        Campus Universitario de Teatinos
        29010 Malaga, Spain
      • Ignacio Herrero Reder
        Message 3 of 7, Sep 26, 2013
          Hello! I'm interested in adding some auto-localization features to my AIBO robot. Would it be possible to use VisualRoutinesBehavior (and its ParticleFilter) to achieve this?
          Right now my own behavior (I'm not using nodes) is a subclass of BehaviorBase... should I make it a subclass of VisualRoutinesBehavior instead?
          Thanks in advance
          Ignacio

          Ignacio Herrero Reder / Tl. +34-95.213.71.60
          Dpto. Tecnologia Electronica / Fax: +34-95.213.14.47
          E.T.S. Ing. Telecomunicacion / nhr@...
          Universidad de Malaga / http://www.dte.uma.es
          Campus Universitario de Teatinos
          29010 Malaga, Spain
        • Dave Touretzky
          Message 4 of 7, Sep 26, 2013
            > Hello! I'm interested in adding some auto-localization features
            > to my AIBO robot. Would it be possible to use
            > VisualRoutinesBehavior (and its ParticleFilter) to achieve this?

            Yes, the ShapeBasedParticleFilter class will do localization. It is
            used by the Pilot to handle localization.

            > Right now my own behavior (I'm not using nodes) is a subclass of
            > BehaviorBase... should I make it a subclass of
            > VisualRoutinesBehavior instead?

            Yes, you'll need to do that to use the particle filter.
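            In outline, something like this (a sketch only; check the actual
            Tekkotsu headers for the exact include path and signatures):

                // Sketch only -- verify the include path, base-class name,
                // and method signatures against the Tekkotsu sources.
                #include "Behaviors/VisualRoutinesBehavior.h"  // path may differ

                class MyLocalizer : public VisualRoutinesBehavior {
                public:
                    MyLocalizer() : VisualRoutinesBehavior("MyLocalizer") {}
                    virtual void doStart() {
                        VisualRoutinesBehavior::doStart();
                        // Build shapes from camera input here, then feed them
                        // to ShapeBasedParticleFilter to estimate the pose.
                    }
                };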

            But if you're not using nodes at all, then you're missing out on a lot
            of the goodness of Tekkotsu, such as the MapBuilder, which handles
            vision processing, and the Pilot, which does localization and
            navigation.

            I don't know how well these things will run on an AIBO these days; we no
            longer support that platform. But it should be easy to compile some demo
            programs and see. Try the labs on the Tekkotsu wiki:

            http://wiki.tekkotsu.org/index.php/Labs

            -- Dave Touretzky
          • Ignacio Herrero Reder
            Message 5 of 7, Sep 27, 2013
              Thanks for your help, Dave!! Unfortunately, when I started working with Tekkotsu there weren't so many of the amazing features you have developed in recent years!! So I programmed a fairly simple behavior that just receives movement commands (both head and body) and sends images to the PC. I have also deactivated the on-board segmentation, as it was too slow (on the AIBO) for real-time operation when considering more than 1-2 colors. For now I just send the plain image and do all the processing on the PC.
              I'm using your same segmentation algorithm (CMVision), but it runs on the PC as a JNI-C module. I have also changed the image headers so that I send all the data I need together (image pixels, robot velocity, head position), because of synchronization issues: when I was using different sockets the data often arrived out of sync, and I had to do a lot of work to re-synchronize it.
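              The combined message is essentially a small header in front of the pixel data, something like this (simplified; the field names and types here are my own):

                  // Simplified version of the combined packet (field names and
                  // types are mine; the real code carries more state).
                  struct RobotFramePacket {
                      uint32_t frameNumber;      // matches the number shown in MIRAGE
                      float    bodyVelocity[3];  // forward, lateral, angular velocity
                      float    headAngles[3];    // tilt, pan, nod joint positions
                      uint32_t width, height;    // image dimensions
                      // pixel data follows immediately after this header
                  };

              Sending everything in one message guarantees that the image and the motion state describe the same instant, which separate sockets could not.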
              I don't know whether it will be possible to adapt my behavior structure to nodes, as I'm doing a lot of things off-board, but I'll give it a try when I have some time, as I think it is surely worth it.

              Regards
                 Ignacio

              Ignacio Herrero Reder            / Tl. +34-95.213.71.60
              Dpto. Tecnologia Electronica     / Fax: +34-95.213.14.47 
              E.T.S. Ing. Telecomunicacion     / nhr@... 
              Universidad de Malaga            / http://www.dte.uma.es
              Campus Universitario de Teatinos 
              29010 Malaga, Spain  

