
Re: Desired - A Uniform Platform for Robotics

  • dan michaels
    Message 1 of 361, Dec 31, 2008
      --- In SeattleRobotics@yahoogroups.com, "Matthew Tedder"
      <matthewct@...> wrote:
      >
      > Well, more recent research is finding more and more of the necessary
      > procedural memory/processing to be in the spinal cord and the
      > peripheral nervous system. Arguments abound over how much is
      > processed where... but the trend is shifting more toward the
      > periphery. Like I said, I lean more toward the theory of embodiment
      > (the brain is a support subsystem of the periphery, and not the
      > other way around).
      >


      "Shifting"... not likely. The periphery has a few tens of millions
      of neurons; the CNS has 100 billion. No one I've seen seriously
      doubts that the real meat of the processing takes place in the CNS,
      where the vast majority of neurons are.

      The idea of embodiment, as put forward by Rolf Pfeifer and
      colleagues, for instance, is not that processing is shifting toward
      the periphery; it is that certain forms of innate "mechanical"
      processing take place at the periphery, due to the physical nature
      of the skeleton/tendons/musculature/etc., which effectively reduces
      the amount of processing that needs to be done centrally.

      E.g., the five fingers are arranged in a linear fashion on the hand
      and can only bend in certain ways, which reduces the number of DOF
      that need to be controlled in the finger subsystem. However, the
      fingers still don't know how to type a sentence by themselves.
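
      As a toy illustration of that DOF-reduction idea, here is a minimal
      Python sketch. The joint names, coupling weights, and limits are
      invented for illustration (not measurements of a real hand): a
      single "grasp" command drives eight finger joints through a fixed
      synergy, so only one value has to be commanded centrally.

      # Minimal sketch: a fixed "synergy" table couples one grasp command
      # to many finger-joint angles, so the controller handles only 1 DOF.
      # Joint names, weights, and the limit are invented for illustration.
      import math

      # joint name -> fraction of full flexion reached at full grasp
      SYNERGY = {
          "index_mcp": 1.0, "index_pip": 0.8,
          "middle_mcp": 1.0, "middle_pip": 0.8,
          "ring_mcp": 0.9, "ring_pip": 0.7,
          "pinky_mcp": 0.8, "pinky_pip": 0.6,
      }
      JOINT_MAX_RAD = math.radians(90.0)   # crude common flexion limit

      def finger_angles(grasp):
          """Map one grasp command in [0, 1] to eight joint angles (rad)."""
          grasp = min(max(grasp, 0.0), 1.0)   # the mechanics clip the range
          return {name: w * grasp * JOINT_MAX_RAD
                  for name, w in SYNERGY.items()}

      if __name__ == "__main__":
          for g in (0.0, 0.5, 1.0):
              print(g, {k: round(v, 2) for k, v in finger_angles(g).items()})

      The fixed coupling plays the role of the tendons and joint geometry:
      it absorbs most of the coordination, but knowing when and how far to
      grasp (let alone how to type) still has to come from somewhere else.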




      > Lately, I've been reading some of the work of David Rector, about
      > how patterns of oscillation can be used to address signals to
      > different parts of the nervous system. Music is a good example.
      > Perhaps a technique for mapping functionality can be developed
      > from this kind of research.
      >
      > Matthew
      >
      > On Wed, Dec 31, 2008 at 11:42 AM, dan michaels <oric_dan@...> wrote:
      >
      > > --- In SeattleRobotics@yahoogroups.com, "Matthew Tedder"
      > > <matthewct@> wrote:
      > > >
      > > > Actually, I meant to emphasize sensors. That's why I mentioned
      > > > gyros, stepper/servo motors with feedback, stereo cameras,
      > > > sound in/out, and pressure sensors.
      > > >
      > > > Work on walking robots began in the 1970s, but they never
      > > > really worked well until rich sensory feedback was introduced.
      > > > The algorithms didn't matter nearly so much as the real-time
      > > > feedback. In fact, it's been discovered that the basic skill of
      > > > walking, in humans, is coded into the nervous system of the
      > > > legs themselves, with only communication links to the brain.
      > > > Many people have noticed their fingers know a lot of things
      > > > their heads don't--like how to spell words or phone numbers of
      > > > friends. I am very much a fan of the theory of embodiment, that
      > > > our intelligence is really centered more in the peripheral
      > > > nervous system, with the central nervous system (aka brain /
      > > > spinal cord) as more of a supporting subsystem. But really,
      > > > there are no clear lines.
      > > >
      > >
      > > Ehh!
      > >
      > > This is somewhat of a gross over-simplification. Your fingers
      > > don't actually know anything at all about "spelling words", for
      > > instance. Functions such as typing are actually controlled from
      > > the higher levels of the brain and cerebellum, though often below
      > > the level of consciousness.
      > >
      > > So, what you're really talking about are brain functions that
      > > occur within or outside the level of conscious control, not the
      > > split between the central and peripheral nervous systems. They
      > > are not the same thing.
      > >
      > > Also, your legs don't know how to walk. Your spinal cord has some
      > > built-in reflexes that can produce rough oscillatory movements,
      > > such as gross crawling movements, but anything beyond that is
      > > controlled by the central nervous system, albeit, again, much of
      > > it below the level of consciousness.
      > >
      > > OTOH, it's correct that feedback is what makes the nervous system
      > > work as well as it does. Every single behavior is constantly
      > > monitored by sensory feedback, both internal (e.g., via
      > > proprioception) and external (e.g., via vision). Even ballistic
      > > movements like shooting out an arm can be modified mid-execution
      > > as a result of feedback.
      > >
      > > >
      > > > If there is anything we "are", it's our whole nervous system. A
      > > > neuron is the first cell to fully specialize in the fetus, and
      > > > our neurons (generally) last our entire lives. They are us--the
      > > > homunculus in your head.
      > > >
      > > > Many people here talk about how AI has been such a failure, and
      > > > I certainly don't argue against that point. But even those old,
      > > > highly invalid neural nets and perceptrons of the 1980s are
      > > > suddenly much better at what they do with merely the
      > > > introduction of more real-time feedback.
      > > >
      > >
      > > Well, this is certainly true.
      > >
      > > >
      > > > The hardware platform also needs to provide standardized
      > > > components for building the frames. I suggest sticks with ends
      > > > that connect in pyramidal shapes. As for skin, the sticks
      > > > should have points for securing strings (for stretching cloth
      > > > or rubber over them) or clampable areas for rigid skins. I
      > > > think two sizes of these should exist--large and small, where
      > > > the small can connect to the large. Also, each triangular area
      > > > would act as a module socket for specialized components--
      > > > anything from limb joints to camera mounts or custom designed
      > > > peripherals.
      > > >
      > >
      > > Interesting ideas.
      > >
      > >
      > >
      >
      >
      >
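
      To make the spinal-oscillator point above slightly more concrete,
      here is a toy Python sketch of two coupled phase oscillators (one
      per leg) whose timing is continuously nudged by a fake load signal.
      The oscillator form, the gains, and the "sensor" are all invented
      for illustration; this is not a model of any real spinal circuit or
      robot.

      # Toy central-pattern-generator sketch: two phase oscillators (left
      # and right leg) coupled to settle into antiphase, plus a fake load
      # signal that slows a leg's oscillator while that leg is "loaded".
      import math

      STEPS_PER_SEC = 100                # control/integration rate
      DT = 1.0 / STEPS_PER_SEC           # integration step, seconds
      BASE_RATE = 2 * math.pi * 1.0      # rad/s, nominal 1 Hz stepping
      COUPLE = 2.0                       # pulls the legs toward antiphase
      FEEDBACK = 0.5                     # how much load slows a leg

      def load_sensor(phase):
          """Pretend the leg bears load during half of its cycle (stance)."""
          return 1.0 if math.sin(phase) > 0.0 else 0.0

      def simulate(seconds=5, left=0.0, right=0.5):
          for step in range(seconds * STEPS_PER_SEC):
              # each oscillator advances at the base rate, is pulled toward
              # antiphase with the other leg, and is slowed by its own load
              dl = (BASE_RATE + COUPLE * math.sin(right - left - math.pi)
                    - FEEDBACK * load_sensor(left))
              dr = (BASE_RATE + COUPLE * math.sin(left - right - math.pi)
                    - FEEDBACK * load_sensor(right))
              left += dl * DT
              right += dr * DT
              if step % STEPS_PER_SEC == 0:
                  diff = (right - left) % (2 * math.pi)
                  print(f"t={step * DT:4.1f}s  phase difference ~ {diff:.2f} rad")

      if __name__ == "__main__":
          simulate()

      Even in this toy, the rhythm lives in the oscillators and their
      coupling, while the load signal continually retimes it; anything
      beyond the basic oscillation still has to be shaped and supervised
      from above, which is roughly the division of labor described in the
      message above.
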
    • yhmmc
      Message 361 of 361, Jul 18, 2011
        The Answer is simple. Build a Leaf Robot.

        For more information, go to www.Leafprojects.org

        --- In SeattleRobotics@yahoogroups.com, "Alan Marconett" <KM6VV@...> wrote:
        >
        > I'm not quite sure that's it. Good discussion, 'tho.
        >
        > -----Original Message-----
        > On Behalf Of David Wyland
        >
        > *BINGO*
        >
        > "...the real reason we don't have intelligent robots is the same
        > reason we don't have intelligent AIs, and that is because the
        > algorithms are deficient ..."
        >
        > I'll go along with that. We can't react to what we can barely see (sense).
        >
        > I think in the hardware vs. software discussion, we are suffering
        > from hardening of the categories.
        >
        > It shouldn't be "hardware vs. software". They are complementary
        > (along with mechanicals), and each has tasks it is best at. Some
        > overlap is bound to happen (and is desirable).
        >
        > The problem is that we do not have good algorithms for making
        > robots that work. We tend to think of algorithms as math
        > implemented in software. But an algorithm is a method for doing
        > something, not just an equation or a line of code. The key thing
        > about an algorithm is not how it is implemented but how well it
        > solves the problem it addresses.
        >
        > Yes, an algorithm must solve the problem.
        >
        > I think we got a little lost. Our algorithms will need to be about
        > the physics of motion, the physical motion of the robot as it
        > physically interacts with its dynamic environment. The physical
        > environment is the design area for the algorithms. They will be
        > implemented in a complex combination of hardware and software.
        >
        > Here I tend to disagree a little bit. I think we know quite well
        > how to do a lot of motion. We have hexapod robots! They can move
        > and translate all over the place! A little IK does the trick.
        > Robotic arms as well. And we're not doing too bad with vehicles
        > (DARPA).
        >
        > While we can't do ballet or gymnastics quite yet, because we might
        > need more work on really dynamic movement (possibly what you're
        > saying), we can shuffle around pretty well. But we can't know WHERE
        > to move if we're not yet really understanding where we are, where
        > to move to, and what to avoid. It's the greater understanding and
        > proliferation of sensors that's required.
        >
        > Unfortunately, common programming languages do not have built-in
        > abstractions for motion. They do not even have an abstraction for
        > time, only sequence. And motion is critically about change in
        > position and velocity of things with respect to time.
        >
        > I'm probably not following you here. I don't think we need any
        > "built-in abstractions" for motion. We have motion-control
        > libraries (programs). Look at CNC and contouring. Define waypoints
        > for the travel needed, and work the paths through those points.
        > True, we don't fold in sensory data. We don't have enough sensory
        > data, let alone know how to use it well enough yet.
        >
        > Lack of fundamental abstractions for motion and time, similar to
        > the abstractions of number and arithmetic operations, may be the
        > missing software link to bridge from current languages to robot
        > languages of motion.
        >
        > Not following you here.
        >
        > The hardware folks (including me) may bear down a little too hard
        > on the hardware side because the problem is centered on the
        > physics of motion of the robot in its environment. But this is
        > just a point of view. The physics of motion is about using
        > mathematics to describe the movement of things. It is a *union*
        > of math and measurement.
        >
        > The mechanicals of a robot require the physics and rely on it for
        > any motion. The (computer) hardware must drive the nuts and bolts.
        > I think I'm with you here.
        >
        > I think we should say we have three faces of "ware": Hardware,
        > Software, and Mechanical. Add in electrical/power if need be.
        >
        > We implement our robot designs by a combination of keystrokes and
        > mouse movements. It all looks like software. Even hard-core
        > digital hardware design uses Verilog or VHDL, text-based design
        > languages that are compiled and debugged, a lot like C. So the
        > hardware and software folks are more alike than not.
        >
        > These are just tools. For hardware, one could be wire-wrapping a
        > bit of logic, or bread-boarding a circuit at an early stage. For
        > any "programmed" logic part, PAL, PLA, FPGA, etc., ya gotta
        > program it at some time! And that involves typing the code in.
        > Maybe draw it out on paper, scan it in, and a "schematic capture"
        > program could work from your drawing? It is much easier to edit a
        > line of code than to rip up wiring and start over. Enjoy it!
        >
        > Having this "computer compatible" input allows rule-checks, etc.
        > Nice stuff to have! And yes, the differences between hardware and
        > software designs (and tools) are blurring.
        >
        > We are ultimately doing firmware, a different kind of software. In
        > conventional software, the output is something that a human looks
        > at, whether text or pictures. The output of firmware is something
        > that moves and makes a noise in the physical world, such as a
        > microwave oven or an automobile engine. The kind of code you write
        > and how you debug it is different as a result.
        >
        > I'd say it's that the platform of the embedded product/project is
        > different, and must closely handle I/O and interrupts, often with
        > limited program and data size, and that makes the task different
        > from "PC" (CS) programming.
        >
        > Phidgets provides a nice selection of USB devices that allow a PC
        > to sense and drive most of the robot and sensor hardware out
        > there. So that is not the problem. The problem is that it is very
        > hard to go beyond a simple, 2-wheel+caster robot.
        >
        > These look interesting. But an experienced (or cheap) robotics
        > hobbyist can often build them him- or herself.
        >
        > Maybe we need a more advanced set of "building blocks". Wayne's
        > Robo bricks?
        >
        > Over 90,000 people have purchased the Parallax BOE-Bot. It is a
        > simple table-top robot that runs Basic in an on-board processor
        > called a Stamp. Almost everyone who has bought one has made it run
        > around on a table top using a simple program. Parallax has
        > excellent documentation that helps you bring up the robot and make
        > it do a few things.
        >
        > That's fantastic! A lot of thought and energy went into that design.
        >
        > The BOE-Bot has a hardware patch board and a software library that
        > makes it easy to add sensors to the robot. They even take you
        > through adding bump sensors, LEDs, and a speaker.
        >
        > Perhaps we can propagate that into bigger and more complex robot
        > designs. Let's go to, what, an ARM7 or ARM9, etc., and add some
        > motion-control libraries.
        >
        > But this is where you hit the wall. How do we design a combination
        > of sensors and software to do something interesting, to go beyond
        > running around the table bumping into things? The particular blend
        > of hardware and software is not important: what it does, is.
        >
        > BOE-Bot has its place. A bigger, more ambitious robot platform
        > will be required to do more! What's the next universal platform?
        > Topo? Heathkit's 'bot (both OLD)? What's out there that we like?
        >
        > What do you want on it? You want sensors with that?
        >
        > Even the iRobot Roomba is not much different from the BOE-Bot. It
        > runs around the floor, changing direction when it bumps into
        > things. And they have a lot of money to spend and a lot of
        > university connections to help out. Rodney Brooks of the MIT AI
        > Lab was one of the founders.
        >
        > But it has a task to do, and goes about it. What's the next
        > application that can be addressed, lawn mowing?
        >
        > Someday, it will all be obvious in the past tense, but as a farmer
        > friend once said, "You can't come back from where you haven't
        > been."
        >
        > In the meantime, we need much better algorithms for robot motion -
        > for robot ballet - in a dynamic, always-incompletely-known,
        > unpredictable environment. Detecting the edge of the table so the
        > BOE-Bot does not fall off is just the first step.
        >
        > That might take a while. I think we need to limit the scope, at
        > least until we get things going. A security robot for my house
        > (+yard?) would be a good start. I don't need it to pick up the
        > kids' toys (the kids are grown).
        >
        > Good luck to us all. May we enthusiastically blend our various
        > skills to build better robots in the coming year!
        >
        > Dave Wyland
        >
        > Yes, certainly! We need to identify individual skills, and develop
        > individual goals. An exchange of knowledge and ideas is paramount!
        >
        > Best wishes,
        >
        > Alan KM6VV
        >
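
        Picking up the exchange above about waypoints and about languages
        lacking an abstraction for time: one way to make that concrete is
        a trajectory indexed by time rather than a bare sequence of
        positions. The Python sketch below (class and function names are
        invented, not any existing library) stores timestamped waypoints
        and lets a control loop ask "where should I be at time t".

        # Sketch of a motion abstraction where time is explicit: a
        # trajectory is a list of (time, position) waypoints, and the
        # controller samples it at a fixed rate instead of just stepping
        # through positions. Names here are invented for illustration.
        from bisect import bisect_right

        class TimedTrajectory:
            def __init__(self, waypoints):
                # waypoints: (t_seconds, position) pairs, t strictly increasing
                self.times = [t for t, _ in waypoints]
                self.positions = [p for _, p in waypoints]

            def sample(self, t):
                """Position at time t, linearly interpolated between waypoints."""
                if t <= self.times[0]:
                    return self.positions[0]
                if t >= self.times[-1]:
                    return self.positions[-1]
                i = bisect_right(self.times, t)
                t0, t1 = self.times[i - 1], self.times[i]
                p0, p1 = self.positions[i - 1], self.positions[i]
                return p0 + (p1 - p0) * (t - t0) / (t1 - t0)

        if __name__ == "__main__":
            # One joint: move 0 -> 90 -> 45 degrees over 4 seconds.
            traj = TimedTrajectory([(0.0, 0.0), (2.0, 90.0), (4.0, 45.0)])
            for tick in range(9):               # pretend a 2 Hz control loop
                t = tick * 0.5
                print(f"t={t:3.1f}s  command={traj.sample(t):6.1f} deg")

        Velocity limits, blending between segments, and sensor-driven
        corrections would all layer on top of this, but even this much
        gives a program an explicit notion of position as a function of
        time.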