How can I use the Python Remote API and kinect sensor to enable a robot to pick up objects??

Typically: "How do I... ", "How can I... " questions
Post Reply
SS_Seagull
Posts: 3
Joined: 18 May 2020, 01:18

How can I use the Python Remote API and kinect sensor to enable a robot to pick up objects??

Post by SS_Seagull »

Hello!

I am new to the CoppeliaSim environment and would like to simulate a robot picking up an object.

For this I am using an IRB140 robot with a gripper and a Kinect. The Kinect acquires RGB and depth information and feeds it to a neural network, which detects the object and returns a rectangular bounding box (this part is currently working, thanks to other posts on this forum).

I would like to use this bounding box to calculate the target point (center of the rectangle) and feed it to the robot, which would then appropriately orient itself to grab the object.

The issue is: how do I convert the target point from a 2D location in an image into world coordinates? And how do I use the Python Remote API to communicate with the robot?

Thank you!

coppelia
Site Admin
Posts: 7838
Joined: 14 Dec 2012, 00:25

Re: How can I use the Python Remote API and kinect sensor to enable a robot to pick up objects??

Post by coppelia »

Hello,

Have a look at the model in your model library at Models/components/sensors/Blob to 3D position.ttm. It contains the calculations that transform an image position (X/Y) into a world position (X/Y/Z).
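
To illustrate the idea behind those calculations (this is only a minimal sketch, not a copy of that model's script), here is how a pixel plus its depth value can be turned into a point in the sensor's own frame. All constants are placeholders and must match your own sensor's settings:

import math

# placeholder sensor parameters: use the values of YOUR vision sensor
RES_X, RES_Y = 640, 480            # sensor resolution
PERSP_ANGLE = math.radians(57.0)   # perspective angle
NEAR_CLIP, FAR_CLIP = 0.01, 3.5    # near/far clipping planes

def pixel_to_sensor_frame(px, py, depth_value):
    # depth buffer values are normalized: 0 = near clipping plane, 1 = far clipping plane
    z = NEAR_CLIP + depth_value * (FAR_CLIP - NEAR_CLIP)
    # the perspective angle applies to the larger resolution dimension (here RES_X)
    tan_half_x = math.tan(PERSP_ANGLE / 2.0)
    tan_half_y = tan_half_x * RES_Y / RES_X
    # normalized image coordinates in [-1, 1], pixel centers at +0.5
    u = 2.0 * (px + 0.5) / RES_X - 1.0
    v = 2.0 * (py + 0.5) / RES_Y - 1.0
    # point in the vision sensor's frame; sign conventions may need adjusting,
    # and remember that CoppeliaSim image data start at the bottom-left corner
    return (z * tan_half_x * u, z * tan_half_y * v, z)

# e.g. the center of the bounding box returned by your neural network (example values)
x1, y1, x2, y2 = 200, 150, 280, 230
cx, cy = (x1 + x2) // 2, (y1 + y2) // 2
print(pixel_to_sensor_frame(cx, cy, 0.42))

That gives you the point relative to the vision sensor. To obtain world coordinates you still need to transform it by the sensor's pose in the scene (e.g. with sim.getObjectMatrix and sim.multiplyVector in a child script), which is the kind of transform that model performs.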

Also have a look at the simple example Python programs in programming/remoteApiBindings/python and/or programming/b0RemoteApiBindings/python to see how to connect to and interact with CoppeliaSim from Python.
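
For the legacy remote API, a minimal connection sketch could look like the following. It assumes the scene starts a remote API server on port 19999 (e.g. with simRemoteApi.start(19999) in a child script), that sim.py, simConst.py and the remoteApi library from programming/remoteApiBindings/python sit next to your script, and that the object names are only placeholders:

import sim   # legacy remote API bindings (sim.py / simConst.py / remoteApi library)

sim.simxFinish(-1)   # close any previously opened connections
client_id = sim.simxStart('127.0.0.1', 19999, True, True, 5000, 5)
if client_id == -1:
    raise RuntimeError('could not connect to CoppeliaSim')

# object names are placeholders: use the names from your own scene hierarchy
ret, rgb_handle = sim.simxGetObjectHandle(client_id, 'kinect_rgb', sim.simx_opmode_blocking)
ret, depth_handle = sim.simxGetObjectHandle(client_id, 'kinect_depth', sim.simx_opmode_blocking)

# ... read the sensors, move the robot, etc. ...

sim.simxFinish(client_id)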

Cheers

SS_Seagull
Posts: 3
Joined: 18 May 2020, 01:18

Re: How can I use the Python Remote API and kinect sensor to enable a robot to pick up objects??

Post by SS_Seagull »

Hello!

Thank you for the reply!

I was hoping you could explain how exactly this line in the code works, and how I could change it to work with the Kinect, since it has two handles (kinect_depth and kinect_rgb) and I am not too sure how they work:
local res,packet1,packet2=sim.handleVisionSensor(sensor)
What exactly is going on here?

coppelia
Site Admin
Posts: 7838
Joined: 14 Dec 2012, 00:25

Re: How can I use the Python Remote API and kinect sensor to enable a robot to pick up objects??

Post by coppelia »

That line (i.e. local res,packet1,packet2=sim.handleVisionSensor(sensor)) basically acquires a new image with the specified vision sensor. After the acquisition of that image, a vision callback function is automatically called (if one is available). Then the call returns.

In the Kinect model available in the model library (in components/sensors/kinect.ttm), sim.handleVisionSensor is implicitly called via the main script with sim.handleVisionSensor(sim.handle_all_except_explicit), since both vision sensors have their explicit handling flag unchecked.
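
From the Python side you do not call sim.handleVisionSensor yourself: the main script already handles both sensors, and you simply read their data over the remote API. A rough sketch (client_id, rgb_handle and depth_handle obtained as in the earlier example; numpy is only used for convenience):

import numpy as np
import sim   # legacy remote API bindings

# RGB sensor: option 0 = RGB image, returned as a flat list of (signed) byte values
ret, res, img = sim.simxGetVisionSensorImage(client_id, rgb_handle, 0, sim.simx_opmode_blocking)
rgb = np.asarray(img).astype(np.uint8).reshape(res[1], res[0], 3)

# depth sensor: values normalized to [0, 1] between the near and far clipping planes
ret, res_d, depth = sim.simxGetVisionSensorDepthBuffer(client_id, depth_handle, sim.simx_opmode_blocking)
depth_map = np.asarray(depth, dtype=np.float32).reshape(res_d[1], res_d[0])

# CoppeliaSim image data start at the bottom-left corner; flip vertically if your
# detector expects a top-left origin
rgb = np.flipud(rgb)
depth_map = np.flipud(depth_map)

The depth value at the center of your bounding box can then be fed into a calculation like the one sketched earlier in this thread.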

Cheers

SS_Seagull
Posts: 3
Joined: 18 May 2020, 01:18

Re: How can I use the Python Remote API and kinect sensor to enable a robot to pick up objects??

Post by SS_Seagull »

Hello!

How would I go about instructing the IRB140 to move by giving it the (x, y, z) values of my target object's position?
I am currently using sim.simxSetObjectPosition on the ManipulatorSphere, since this is the only function I am aware of that accepts Cartesian points as input.
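
Roughly, what I am doing looks like this (simplified; client_id comes from simxStart, the sphere name has to match the one in my scene hierarchy, and the coordinates are just example values):

import sim   # legacy remote API bindings

ret, target = sim.simxGetObjectHandle(client_id, 'ManipulatorSphere', sim.simx_opmode_blocking)

goal = [0.45, 0.10, 0.25]   # (x, y, z) of the detected object in world coordinates
# -1 = position expressed relative to the world frame
sim.simxSetObjectPosition(client_id, target, -1, goal, sim.simx_opmode_oneshot)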

Second, I am unable to connect an end effector to the IRB140 using the Assemble/Disassemble option. It always pops off when I start the simulation, but the Kinect camera attaches correctly. Is there any other procedure I am unaware of here?

Third, is there any alternative way of finding the coordinates of objects using the Kinect?
