I want to generate a collision-free path with OMPL and IK for my UR5 robot, so I converted some code from scenes/pathPlanning/path_planing_and_grasp.ttt.
But in my code it is slow and difficult to compute a valid goal config. Even when the code finds a goal config, the path generated by simOMPL collides with the floor. Also, how should I check for self-collision on the UR5? When I set collisionPairs = { collection, collection }, the code can't find any solution. Could you help me check the code? The main function is generatePath() in the UR5 script. Here is the scene file:
https://github.com/Juzhan/UR5_test
By the way, if the robot grasps a box with a gripper, how should I make the box part of the robot for collision-free planning?
Thanks in advance!
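(For context, a common approach in CoppeliaSim is to parent the grasped object to the gripper and add it to the robot's collision collection. A minimal sketch; the object names, the `robotCollection` variable and where you call this are assumptions, not taken from the scene above:)

```lua
-- Sketch: make a grasped box count as part of the robot for planning.
-- 'Box' and 'UR5_connection' are placeholder object names.
local box = sim.getObjectHandle('Box')
local gripperTip = sim.getObjectHandle('UR5_connection')

-- Attach the box to the gripper so it follows the arm:
sim.setObjectParent(box, gripperTip, true) -- true: keep the box in place

-- Explicitly add it to the robot's collision collection as well:
sim.addItemToCollection(robotCollection, sim.handle_single, box, 0)
```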
Use OMPL for UR5
Re: Use OMPL for UR5
Hello,
it appears that your robot is in a constant self-collision state. Try this:
Code: Select all
function sysCall_init()
    robot = sim.getObjectHandle(sim.handle_self)
    collection = sim.createCollection(0)
    sim.addItemToCollection(collection, sim.handle_tree, robot, 0)
end

function sysCall_sensing()
    -- Check for self-collision:
    local res, collidingObjects = sim.checkCollision(collection, collection)
    if res > 0 then
        print(string.format("Self-collision: objects %s and %s", sim.getObjectName(collidingObjects[1]), sim.getObjectName(collidingObjects[2])))
    end
    -- Check collision with the environment:
    local res, collidingObjects = sim.checkCollision(collection, sim.handle_all)
    if res > 0 then
        print(string.format("Robot-environment collision: %s and %s", sim.getObjectName(collidingObjects[1]), sim.getObjectName(collidingObjects[2])))
    end
end
and you will quickly see what is in collision. You can also check the color of the colliding objects:
Code: Select all
function changePairColor(entityPair, colorPair)
    originalColorData = {}
    originalColorData[1] = sim.changeEntityColor(entityPair[1], colorPair[1])
    originalColorData[2] = sim.changeEntityColor(entityPair[2], colorPair[2])
end

function restorePairColor()
    if originalColorData then
        sim.restoreEntityColor(originalColorData[1])
        sim.restoreEntityColor(originalColorData[2])
        originalColorData = nil
    end
end

function sysCall_init()
    robot = sim.getObjectHandle(sim.handle_self)
    collection = sim.createCollection(0)
    sim.addItemToCollection(collection, sim.handle_tree, robot, 0)
    collisionColors = {{1, 0, 0}, {1, 0, 1}} -- collider and collidee
end

function sysCall_cleanup()
    restorePairColor()
end

function sysCall_sensing()
    -- Check for self-collision:
    local res, collidingObjects = sim.checkCollision(collection, collection)
    restorePairColor()
    if res > 0 then
        print(string.format("Self-collision: objects %s and %s", sim.getObjectName(collidingObjects[1]), sim.getObjectName(collidingObjects[2])))
        changePairColor({collidingObjects[1], collidingObjects[2]}, collisionColors)
    else
        -- Check collision with the environment:
        local res, collidingObjects = sim.checkCollision(collection, sim.handle_all)
        if res > 0 then
            print(string.format("Robot-environment collision: %s and %s", sim.getObjectName(collidingObjects[1]), sim.getObjectName(collidingObjects[2])))
            changePairColor({collection, collidingObjects[2]}, collisionColors)
        end
    end
end
Cheers
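(Once the collection is collision-free in the start configuration, it can also be handed to the planner. A minimal sketch, assuming a simOMPL task has already been created as `task` and `collection` is the robot collection from above:)

```lua
-- Sketch: use the robot collection for OMPL state validation.
-- The pair {collection, collection} catches self-collision,
-- {collection, sim.handle_all} catches collision with everything else.
simOMPL.setCollisionPairs(task, {collection, collection, collection, sim.handle_all})
```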
Re: Use OMPL for UR5
Thanks! I adjusted the size of some parts of the UR5 model, and now I can find a path when collisionPairs = { collection, collection }. But I used the UR5 model directly from the model browser; maybe you could modify the UR5 model in the next software update to avoid self-collision.
Re: Use OMPL for UR5
Instead of adjusting the model itself, you could have just adjusted the Collection self-collision indicator.
Cheers
Re: Use OMPL for UR5
Thanks a lot!
And I still have 2 questions.
1. The UR5 robot has an initial child script that contains a function coroutineMain(). I also wrote a path planning and execution function in that child script.
If I want to use the path planning and execution function, I have two ways to do it:
(a) call the function directly in coroutineMain,
(b) call the function via the Python remote API.
Method (a) produces a stable movement when executing the path, but the path planning is much slower than with method (b). Method (b) plans the path very quickly, but the execution is too fast and even causes some swinging of the robot arm. I tried reducing maxVel, maxAcc and maxJerk, but that didn't help.
What might be happening with these two methods? I would expect both of them to perform the same in planning and in execution.
2. I am working on a bin packing problem, so I placed some closely packed objects in a scene. After I grasp an object, I add it to the robot collection via sim.addItemToCollection for collision detection. But the object is in contact with other objects, so the grasped object is in collision with the scene and I can't plan a path that moves the grasped object. Is there any way to avoid this problem, e.g. by adjusting the detection tolerance or the collision volume?
Cheers
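(For what it's worth, one way to get identical execution in both cases is to step through the planned path inside the simulation loop. A rough sketch, assuming `path` is the flat configuration list returned by simOMPL.getPath and `jointHandles` holds the six joint handles; the function name is made up:)

```lua
-- Sketch: replay an OMPL path one configuration per simulation step,
-- from within a threaded child script.
function followPath(path, jointHandles)
    local dof = #jointHandles
    for i = 0, (#path / dof) - 1 do
        for j = 1, dof do
            sim.setJointTargetPosition(jointHandles[j], path[i * dof + j])
        end
        sim.switchThread() -- wait for the next simulation step
    end
end
```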
Re: Use OMPL for UR5
If you run some code in a coroutine/thread, then that code can often be interrupted (since it is a thread/coroutine!). To avoid that, you can have a section that forbids automatic thread switches:
Code: Select all
local l = sim.setThreadAutomaticSwitch(false) -- temporarily forbid automatic thread switching
...
-- execute some task here without interruption
...
sim.setThreadAutomaticSwitch(l) -- restore the original automatic thread switching
Now, if you want to execute a movement from a remote API client, or from any other external application, then simply trigger the execution from there, but do not execute the movement directly. Let the thread handle it, e.g. here is an extremely simple example:
Code: Select all
...
function legacyRemoteApi_triggerMovement(intData, floatData, stringData, buffer)
    -- prepare the movement data here, e.g.
    movementData = {}
    movementData.goalPosition = floatData
    return {}, {}, {}, ''
end

function coroutineStart()
    while true do
        if movementData then
            -- execute the movement here
            movementData = nil
        end
    end
end
For a more elaborate example, have a look at the demo scene scenes/messaging/movementViaRemoteApi.ttt (that scene runs with a client Python application you can find in the programming/remoteApiBindings, programming/b0RemoteApiBindings or programming/zmqRemoteApi folders).
About your second question: maybe simply do not add the object to the collection? You can define many different collections, e.g. your robot, the environment, the environment minus the part to grasp, etc. Then simply check the appropriate collision pairs (e.g. robot collection vs. environment collection).
Cheers
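(The multi-collection idea could be sketched as follows; the variable names, and `robotBase`/`graspedObject` being valid handles, are assumptions:)

```lua
-- Sketch: separate collections so the grasped object is checked
-- as part of the robot, not as part of the environment.
local robotColl = sim.createCollection(0)
sim.addItemToCollection(robotColl, sim.handle_tree, robotBase, 0)

-- Environment = everything, minus the robot tree (option 1 = exclude):
local envColl = sim.createCollection(0)
sim.addItemToCollection(envColl, sim.handle_all, -1, 0)
sim.addItemToCollection(envColl, sim.handle_tree, robotBase, 1)

-- After grasping, exclude the grasped object from the environment...
sim.addItemToCollection(envColl, sim.handle_single, graspedObject, 1)
-- ...and add it to the robot collection:
sim.addItemToCollection(robotColl, sim.handle_single, graspedObject, 0)

-- Then plan with, e.g.:
-- simOMPL.setCollisionPairs(task, {robotColl, envColl})
```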