
Re: [beam] RE: New project under way

  • Martin McKee
    Dec 31, 2013
      Not entirely alone!  Though I'm still far too rushed off my feet to do any sort of BEAM work besides the odd daydream while behind the wheel somewhere.  The project sounds good to me; it really sounds very much like a horse-and-rider style of control system.  What I "might" do ( and I do mean might, I'd say it's equally probable that I won't ) is implement the same sort of system with a processor in place of the low-level BEAM "protective" system.  Of course, I've considered the BEAM + processor approach as well and like it for several reasons.  Given that I have so much experience with programming, I would feel quite comfortable writing the firmware for the low-level control, and I feel it could be trusted in all cases ( even if that weren't true of more complex control algorithms ).  Either way, I suppose.

      But I do like the structure of multi-layer control systems: simple systems connected directly to the hardware to handle reflexive actions, and more complex systems stacked on top to influence the lower ones.  The modularity makes it much easier to ensure that basic behaviors are working before more difficult ones are tackled and, as you say, it allows one to protect the robot from programming flubs.

      Sounds interesting,
      Martin Jay McKee

      On Tue, Dec 31, 2013 at 7:27 PM, <connor_ramsey@...> wrote:

      Actually, I think I'll instead use the servo to rotate the ultrasonic sensor, since I actually have two GM7 gearmotors lying around already. So the final concept, I suppose, is a two-motor uniwalker style droid, with a third servomotor in the head. The body structure will probably consist of a metal frame built around the PCB, with the solar panel mounted on top of that. The sensory array should consist of: 1) an ultrasonic distance sensor (head-mounted), 2) two photodiode "eyes" (head-mounted), 3) two wheel pots for angle measurement (motor-mounted), and 4) two IR sensors (body-mounted) for surface detection.

      I still may implement an enslaved BEAM core behind each leg motor, as a reflexive fail-safe against any suicidal tendencies in the machine's programming, either accidental or intentional (I don't want to lose a $100 project). I guess the way the bot could sense danger would be to monitor the IR sensors. They will override the controller's commands in two cases: 1) a sensor detects nothing, meaning there is a drop in that direction, or 2) a sensor detects a higher IR level than the emitter outputs, meaning that there is either a dangerous heat source to avoid, or too much ambient light to safely navigate in that direction. Also, if the battery level becomes too low, the charging circuit switches control from the Arduino to a photo-bicore wired to the photodiodes in the head, which directs the bot toward the nearest light source, but can still be overridden just the same. The controller is only brought back online when the charge is full. The slave bicores also use the wheel pots on their motors as the ground resistor for each Nv; when a leg goes out of sync or gets stuck, its bicore responds and also communicates with the opposite core.

       Some of you might argue that none of this is necessary when I could just program the Arduino to perform all of these functions. But the purpose of implementing them in hardware is to prevent faulty coding from causing physical damage to the robot; in other words, to protect it from my coding noobishness. Also, the controller doesn't have to be made aware of physical problems; the body solves these issues on its own to ensure that the controller can carry out its goal. That's not to say the controller doesn't take input from these sources; it just doesn't need to be responsible for every iterative motion.

      A fun side note: the bot could potentially get "trapped" inside a shape drawn on the ground, like a line follower, which is why I may add a white LED alongside the IR emitter, controlled by the Arduino, so the bot can tell the difference between a dark surface and a hole, but only if the program activates the LED in such an instance. Otherwise, the bot can be guided along by drawing a path to follow. So depending on the robot's goal, either is a valid option.

      Enjoy, Connor(I mean, to myself, since I'm obviously alone again...)
