
Re: [SeattleRobotics] Re: Desired - A Uniform Platform for Robotics

  • Peter Balch
    Message 1 of 361, Jan 1, 2009
      Happy New Year.

      > "shifting" ... not likely. Periphery has a few 10s of millions of
      > neurons, CNS has 100 billion. No one that I've ever seen really
      > doubts but that the real meat of processing takes place in the CNS,
      > where the vast majority of neurons are.

      IMHO, the "Periphery" is a philosophical place, not necessarily a physical
      one. It doesn't have to mean outside the CNS.

      One of Brooks's most useful diatribes was against the "classical AI" idea
      that all processing should be done in a single "brain" which contains a
      model of the world. All sensory input goes into the model and affects it in
      some way; a demon then looks at the model and decides what to do next. There
      are feedback loops but they all go through the model.

      Instead, Brooks argued that feedback loops should be as short as possible.
      A peripheral loop would send to the central modeller a simplified,
      concentrated description of what it was doing. The central modeller would
      tell the peripheral loop what its current parameters should be.

      Robots should react to inputs at the "lowest" (i.e. most peripheral) level
      possible.
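      As a sketch of that layered idea (everything below is hypothetical, not
      Brooks's actual code): behaviours are consulted from most reflexive to
      most deliberative, and the first one to claim control wins.

```python
# Minimal subsumption-style stack: each behaviour either claims control or
# defers; higher-priority (more peripheral/reflexive) behaviours are
# consulted first. All names and sensor values here are hypothetical.

def avoid_bump(sensors):
    # Most peripheral loop: reflexively back away on contact.
    if sensors["bumper"]:
        return ("reverse", 0.5)
    return None  # defer to lower-priority layers

def seek_goal(sensors):
    # More "central" layer: only steers when no reflex has claimed control.
    return ("forward", sensors["heading_error"] * 0.1)

LAYERS = [avoid_bump, seek_goal]  # highest priority first

def step(sensors):
    for behaviour in LAYERS:
        command = behaviour(sensors)
        if command is not None:
            return command

print(step({"bumper": True, "heading_error": 0.2}))   # the reflex wins
print(step({"bumper": False, "heading_error": 0.0}))  # goal-seeking runs
```

      The key property is that the peripheral reflex never consults the
      central layer; the loop stays short.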

      Even though we can all see deficiencies in subsumption, I think we would all
      design robots or animals in that way.

      So peripheral vs. central doesn't have to equate to peripheral nervous
      system vs. central nervous system.

      Peter
    • yhmmc
      Message 361 of 361, Jul 18, 2011
        The Answer is simple. Build a Leaf Robot.

        for more information, go to www.Leafprojects.org

        --- In SeattleRobotics@yahoogroups.com, "Alan Marconett" <KM6VV@...> wrote:
        >
        > I'm not quite sure that's it. Good discussion, 'tho.
        >
        > -----Original Message-----
        > On Behalf Of David Wyland
        >
        > *BINGO*
        >
        > "...the real reason we don't have intelligent robots is the same
        > reason we don't have intelligent AIs, and that is because the
        > algorithms are deficient ..."
        >
        > I'll go along with that. We can't react to what we can barely see (sense).
        >
        > I think, in the hardware vs. software discussion, we are suffering
        > from hardening of the categories.
        >
        > It shouldn't be "hardware vs. software". They are complementary (along with
        > mechanicals), and each has tasks that it is best at. Some overlap is bound
        > to happen (and is desirable).
        >
        > The problem is that we do not have good algorithms for making robots
        > that work. We tend to think of algorithms as math implemented in
        > software. But an algorithm is a method for doing something, not just
        > an equation or a line of code. The key thing about an algorithm is not
        > how it is implemented but how well it solves the problem it addresses.
        >
        > Yes, an algorithm must solve the problem.
        >
        > I think we got a little lost. Our algorithms will need to be about the
        > physics of motion, the physical motion of the robot as it physically
        > interacts with its dynamic environment. The physical environment is
        > the design area for the algorithms. They will be implemented in a
        > complex combination of hardware and software.
        >
        > Here I tend to disagree a little bit. I think we know quite well how to do
        > a lot of motion. We have hexapod robots! They can move and translate all
        > over the place! A little IK does the trick. Robotic arms as well. And
        > we're not doing too bad with vehicles (DARPA).
        >
        > While we can't do ballet or gymnastics quite yet, because we might need
        > more work on really dynamic movement (possibly what you're saying), we can
        > shuffle around pretty well. But we can't know WHERE to move if we don't
        > yet really understand where we are, where to move to, and what to avoid.
        > It's the greater understanding and proliferation of sensors that's required.
        >
        > Unfortunately, common programming languages do not have built-in
        > abstractions for motion. They do not even have an abstraction for
        > time, only sequence. And motion is critically about change in position
        > and velocity of things with respect to time.
        >
        > I'm probably not following you here. I don't think we need any "built-in
        > abstractions" for motion. We have motion-control libraries (programs). Look
        > at CNC and contouring. Define waypoints for the travel needed, and work the
        > paths through those points. True, we don't fold in sensory data. We don't
        > have enough sensory data, let alone know how to use it well enough yet.
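        The waypoint idea Alan describes is easy to sketch. A toy version
        (purely illustrative, not any particular CNC library) just interpolates
        evenly spaced points along the polyline through the waypoints:

```python
# Toy waypoint follower: walk a path through given (x, y) waypoints at
# roughly constant spacing, the core of CNC-style contouring.
import math

def interpolate(waypoints, step=0.5):
    """Yield evenly spaced points along the polyline through `waypoints`."""
    points = []
    for (x0, y0), (x1, y1) in zip(waypoints, waypoints[1:]):
        dist = math.hypot(x1 - x0, y1 - y0)
        n = max(1, int(dist / step))
        for i in range(n):
            t = i / n
            points.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    points.append(waypoints[-1])
    return points

path = interpolate([(0, 0), (2, 0), (2, 2)], step=1.0)
print(path)  # points every ~1 unit along the two segments
```

        As Alan notes, nothing here folds in sensory data; the path is fixed
        once the waypoints are chosen.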
        >
        > Lack of fundamental abstractions about motion and time, similar to the
        > abstractions of number and arithmetic operations, may be the missing
        > software link to bridge from current languages to robot languages of
        > motion.
        >
        > Not following you here.
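        One way to read Dave's point about time: in a motion-centric language,
        a trajectory would be a first-class value, a function from time to
        position. A hypothetical sketch (this API is invented, not an existing
        library):

```python
# Sketch of "time as a first-class abstraction": a trajectory is a function
# from time to position, rather than a sequence of steps.
# Hypothetical API, for illustration only.

def line_to(start, end, duration):
    """Return position(t) moving from start to end over `duration` seconds."""
    def position(t):
        u = min(max(t / duration, 0.0), 1.0)  # clamp progress to [0, 1]
        return tuple(s + u * (e - s) for s, e in zip(start, end))
    return position

traj = line_to((0.0, 0.0), (10.0, 0.0), duration=2.0)
print(traj(0.0))  # (0.0, 0.0)
print(traj(1.0))  # halfway: (5.0, 0.0)
print(traj(5.0))  # past the end: clamped at (10.0, 0.0)
```

        Conventional languages give you only sequence; here velocity and
        position with respect to time fall out of the representation itself.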
        >
        > The hardware folks (including me) may bear down a little too hard on
        > the hardware side because the problem is centered on the physics of
        > motion of the robot in its environment. But this is just a point of
        > view. The physics of motion is about using mathematics to describe the
        > movement of things. It is a *union* of math and measurement.
        >
        > The mechanicals of a robot require the physics and rely on it for any
        > motion. The (computer) hardware must drive the nuts and bolts. I think I'm
        > with you here.
        >
        > I think we should say we have three faces of "ware". Hardware, Software and
        > Mechanical. Add in electrical/power if need be.
        >
        > We implement our robot designs by a combination of keystrokes and
        > mouse movements. It all looks like software. Even hard core digital
        > hardware design uses Verilog or VHDL, text based design languages that
        > are compiled and debugged, a lot like C. So the hardware and software
        > folks are more alike than not.
        >
        > These are just tools. For hardware, one could be wire-wrapping a bit of
        > logic, or bread-boarding a circuit at an early stage. For any "programmed"
        > logic part, PAL, PLA, FPGA, etc., ya gotta program it at some time! And that
        > involves typing the code in. Maybe draw it out on paper, scan it in, and a
        > "schematic capture" program could work from your drawing? It is much easier
        > to edit a line of code than to rip up wiring and start over. Enjoy it!
        >
        > Having this "computer compatible" input allows rule-checks etc. Nice stuff
        > to have! And yes, the differences between hardware and software designs
        > (and tools) are blurring.
        >
        > We are ultimately doing firmware, a different kind of software. In
        > conventional software, the output is something that a human looks at,
        > whether text or pictures. The output of firmware is something that
        > moves and makes a noise in the physical world, such as a microwave
        > oven or an automobile engine. The kind of code you write and how you
        > debug it is different as a result.
        >
        > I'd say it's the platform of the embedded product/project that is
        > different: it must closely handle I/O and interrupts, often with limited
        > program and data size, and that makes the task different from "PC" (CS)
        > programming.
        >
        > Phidgets provides a nice selection of USB devices that allow a PC to
        > sense and drive most of the robot and sensor hardware out there. So
        > that is not the problem. The problem is that it is very hard to go
        > beyond a simple, 2-wheel+caster robot.
        >
        > These look interesting. But an experienced (or cheap) robotics hobbyist
        > can often build them him/herself.
        >
        > Maybe we need a more advanced set of "building blocks". Wayne's Robo
        > bricks?
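        On the 2-wheel+caster base Dave mentions: its kinematics are simple
        enough to state in a few lines. A unicycle-model sketch with made-up
        numbers, not code for any specific platform:

```python
# Differential-drive odometry for the classic 2-wheel+caster base:
# integrate left/right wheel speeds into a pose (x, y, heading).
# Wheel speeds, track width and timestep below are hypothetical.
import math

def drive(x, y, theta, v_left, v_right, track, dt):
    """Advance the pose by dt using the unicycle model."""
    v = (v_left + v_right) / 2.0        # forward speed (m/s)
    omega = (v_right - v_left) / track  # turn rate (rad/s)
    return (x + v * math.cos(theta) * dt,
            y + v * math.sin(theta) * dt,
            theta + omega * dt)

pose = (0.0, 0.0, 0.0)
for _ in range(10):  # equal wheel speeds: the robot drives straight
    pose = drive(*pose, v_left=0.2, v_right=0.2, track=0.15, dt=0.1)
print(pose)  # roughly (0.2, 0.0, 0.0) after one second
```

        Unequal wheel speeds make `omega` nonzero, which is all the steering
        such a base has.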
        >
        > Over 90,000 people have purchased the Parallax BOE-Bot. It is a simple
        > table-top robot that runs Basic in an on-board processor called a
        > Stamp. Almost everyone who has bought one has made it run around on a
        > table top using a simple program. Parallax has excellent documentation
        > that helps you bring up the robot and make it do a few things.
        >
        > That's fantastic! A lot of thought and energy went into that design.
        >
        > The BOE-Bot has a hardware patch board and a software library that
        > makes it easy to add sensors to the robot. They even take you through
        > adding bump sensors, LEDs and a speaker.
        >
        > Perhaps we can propagate that into bigger and more complex robot designs.
        > Let's go to what, an ARM7 or ARM9 etc, and add some motion-control
        > libraries.
        >
        > But there is where you hit the wall. How do we design a combination of
        > sensors and software to do something interesting, to go beyond running
        > around the table bumping into things? The particular blend of hardware
        > and software is not important: what it does, is.
        >
        > BOE-Bot has its place. A bigger, more ambitious robot platform will be
        > required to do more! What's the next universal platform? Topo? Heathkit's
        > 'bot (both OLD)? What's out there that we like?
        >
        > What do you want on it? You want sensors with that?
        >
        > Even the iRobot Roomba is not much different from the BOE-Bot. It runs
        > around the floor, changing direction when it bumps into things. And
        > they have a lot of money to spend and a lot of university connections
        > to help out. Rodney Brooks of the MIT AI Lab was one of the founders.
        >
        > But it has a task to do, and goes about it. What's the next application
        > that can be addressed, lawn mowing?
        >
        > Someday, it will all be obvious in the past tense, but as a farmer
        > friend once said, "You can't come back from where you haven't been."
        >
        > In the meantime, we need much better algorithms for robot motion -
        > for robot ballet - in a dynamic, always-incompletely-known,
        > unpredictable environment. Detecting the edge of the table so the
        > BOE-Bot does not fall off is just the first step.
        >
        > That might take a while. I think we need to limit the scope, at least
        > until we get things going. A security robot for my house (+yard?) would
        > be a good start. I don't need it to pick up the kids' toys (they're grown).
        >
        > Good luck to us all. May we enthusiastically blend our various skills
        > to build better robots in the coming year!
        >
        > Dave Wyland
        >
        > Yes, certainly! We need to identify individual skills, and develop
        > individual goals. An exchange of knowledge and ideas is paramount!
        >
        > Best wishes,
        >
        > Alan KM6VV