Re: KHR-2 3d models
- We looked into the IKFast solution you mentioned before, and it seems like the best course of action for our project. Your suggestion below to avoid IKFast would work in the short term, but we are actually planning on expanding our project beyond humanoid robots, so a quick, easy program that can generate IK solvers from a URDF model of a robot would be pretty useful.
As I understand it so far, we feed the IKFast Python script a URDF XML description of our robot along with the joints we want it to generate solvers for, and the script generates C++ code to solve for those joints. If that's the case, would it be simple enough to include the generated file in Tekkotsu? Just have a #include <Motion/KHR2IKFile.cpp> which overloads a function call or something?
Have you put Tekkotsu onto a gumstix? The gumstix Overo Earth is the platform we are planning on using, and we wonder whether there would be any performance hits. However, considering Tekkotsu ran onboard an Aibo, which I believe is less powerful than an Overo, I wouldn't expect too much of a drawback. Do you have any input or intuition on the matter? Thanks!
--- In email@example.com, Ethan Tira-Thompson <ejt@...> wrote:
> A few updates:
> I'm mostly finished with simplified KHR model and kinematics, they've actually been almost done for a while now but I've had a few other things to work on. As soon as I get a minute I'll finish that up and let you know when it's checked in.
> But I've also been meaning to suggest an idea I had for the leg IK... instead of doing the ikfast thing, we might be able to simplify the problem: break it into two pieces, where you solve for the position of the ankle, and then separately solve for the orientation of the foot to keep it level. I think these two pieces could each be solved analytically by IKThreeLink. Solving for the ankle position is straightforward, just specify the first ankle joint offset instead of the foot frame offset. Then the two ankle joints are fairly straightforward... IKThreeLink doesn't do orientation solution, so you could either solve for a foot position directly below the ankle position, or you could do a bit of math and code a specialized solution for this case based on the angles involved, which would probably be a little more efficient.
> (If using the IKThreeLink to solve just the two ankle joints, it will try to use a third joint if it sees one, so you have to mark the knee and hip joints immobile by setting their min and max joint values to 0. So either use two KinematicJoint chains with the joint ranges set up for each solution, or use a single chain and toggle the ranges back and forth when switching which joints you are solving for.)
> On Oct 11, 2010, at 4:43 PM, redbaron148 wrote:
> > We got the rights to distribute the 3d models, or at least we will very shortly, do you need me to cut them down and simplify them for the mirage simulator?
> > Also, has any work been done on the inverse kinematics of the KHR-2? Or did we decide it was something wrong with the way we defined the xml description?
> > -- Aaron
- So, first off, I’ve checked in my simplified KHR2 models and kinematics definition! You might still have some tweaking to do, especially if I accidentally inverted any of the joint rotation axes. Give it a spin :)
I used the dhcalc tool for this as well, so you might want to check out tools/dhcalc/KHR2.dhcalc... dhcalc converts from world-frame coordinates of each joint drawn from the 3D model to the corresponding DH parameters, and I added a feature to allow you to specify the collision shapes as well instead of having to edit the dhcalc supplied models afterward.
> Thus far how I understand the IKFast python script is we feed it a URDF xml description of our robot along with what joints we want it to generate solvers for and then the script generates c++ code to solve for those joints. If this is the case, would it be simple enough to include this generated file into tekkotsu?

Yes, from what I’ve seen so far, I think this should be relatively straightforward. You would need a minor implementation of the Tekkotsu IKSolver class which forwards calls to the functions exported by ikfast.
There are a few mismatches, though. For instance, the Tekkotsu solver allows you to solve for non-origin points, so you will need to backtrace the target point by the offset at the target orientation to get the target origin location. Tekkotsu also allows you to request solutions along target surfaces, not just points, but in common usage you’ll be happy just handling point solutions... that’s all the current analytic solvers can do anyway; only the gradient descent solver supports non-point targets.
I’m not sure what options, if any, ikfast provides for controlling the names of its functions... I notice, for example, that the PR2 output uses “IKSolver” for each of the limbs, so you may need to wrap them in namespaces or be a little careful in your linkage to avoid name conflicts, both between the chains as well as with the main Tekkotsu “IKSolver” class. (You could just forward declare a few function prototypes in a header file used by your Tekkotsu IKSolver subclass and thus prevent the two namespaces from directly seeing each other.)
You could also create a utility to convert between the Tekkotsu .kin format and URDF, which would probably be faster in the long run than trying to do the conversion by hand. I think Tekkotsu to URDF is the most straightforward direction, as Tekkotsu uses minimalist DH notation (e.g. 4 parameters, rotation always about z) whereas URDF provides a superset which may not map back cleanly (e.g. URDF can specify an arbitrary joint rotation axis, not just ‘z’).
> Have you put Tekkotsu onto a gumstix? The gumstix overo earth is the platform we are planning on using and we wonder if there would be any performance hits. However, considering tekkotsu ran onboard an Aibo, which I believe is less powerful than an overo, I wouldn't expect too much of a draw back.

We started out looking at an older-model gumstix, but we weren’t too confident in its FPU performance... more importantly, the development toolchain was such a PITA to set up that it would not be suitable for a larger community. It’s been several years though; the newer boards look plenty fast, and I don’t think you’ll have trouble performance-wise for basic tasks (vision obviously being the 900-pound gorilla if you want to do anything fancy, but our common color-segmentation processing shouldn’t be an issue). I’ll be curious whether the toolchain is better supported now :)