Distorted point cloud

nex
Posts: 7
Joined: 07 Nov 2016, 15:48

Distorted point cloud

Post by nex » 21 Sep 2017, 13:47

Hello,

I have a puzzling error when manually creating a point cloud, which causes it to be distorted.

I made a 3D laser scanner in V-REP, using a proximity sensor that can rotate from left to right and from top to bottom. Sweeping over a certain range of two angles (phi and theta), I record the distance measurements returned by the proximity sensor.
From the angles and the distance information I compute a point in 3D space, relative to the proximity sensor's origin (see sphereCoordsToCartesianCoords function in attached code) and add it to a point cloud object.

To validate my results I also created an .xyz file that can be imported into MeshLab. V-REP and MeshLab agree on the result/visualization, but as I said in the introduction, the result is distorted.

Please see the link below for a simple example scene where I scan a couple of cubes, and dynamically display the result in a point cloud.
As can be seen, the scanned surfaces in the point cloud are not straight, but concave (left and right surfaces/walls) or convex (top wall/surface).



If you have any idea where I made a mistake, perhaps in the conversion to Cartesian coordinates, or somewhere else, I'd be glad for a hint. I am out of ideas.

Here is the simple project (couldn't find a way to attach it directly): http://mh-nexus.de/downloads/PointCloud ... ration.ttt

In case the code is enough to get an idea, here it is:


sphereCoordsToCartesianCoords=function(yawAngle, pitchAngle, distance)
  local theta = yawAngle
  local phi = pitchAngle
  local r = distance

  -- the following spherical coordinate transformation is only valid given these
  -- assertions hold
  assert(r >= 0)
  assert((theta >= 0) and (theta <= math.pi))
  assert((phi >= 0) and (phi < 2*math.pi))

  local x = r * math.sin(theta) * math.cos(phi)
  local y = r * math.sin(theta) * math.sin(phi)
  local z = r * math.cos(theta)

  return x, y, z
end
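As a sanity check, the standard physics-convention mapping used by the Lua function above can be verified outside the simulator. This Python sketch (an editor's illustration, not part of the original scene) mirrors the same formulas; note it assumes theta is measured from the +z axis, which need not match the scanner's joint geometry:

```python
import math

def sphere_to_cartesian(theta, phi, r):
    """Physics convention: theta = polar angle from +z, phi = azimuth from +x."""
    assert r >= 0
    assert 0 <= theta <= math.pi
    assert 0 <= phi < 2 * math.pi
    x = r * math.sin(theta) * math.cos(phi)
    y = r * math.sin(theta) * math.sin(phi)
    z = r * math.cos(theta)
    return x, y, z

# theta = 0 points straight along +z: the result is (0, 0, r).
print(sphere_to_cartesian(0.0, 0.0, 2.0))  # (0.0, 0.0, 2.0)
# theta = pi/2, phi = 0 lies on the +x axis (up to floating-point noise).
print(sphere_to_cartesian(math.pi / 2, 0.0, 2.0))
```

If the sensor's actual geometry measures its angles from different axes, the same formulas will still produce a cloud, just a systematically warped one, which is consistent with the curved walls described above.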


threadFunction=function()
    while simGetSimulationState()~=sim_simulation_advancing_abouttostop do

        simSetJointTargetPosition(yawJoint, yaw_angle)
        simSetJointTargetPosition(pitchJoint, pitch_angle)
        simWait(0.001)

        local result, dist=simReadProximitySensor(proxSensor)
        if (result>0) then
            local x, y, z = sphereCoordsToCartesianCoords(yaw_angle - min_yaw_angle, pitch_angle - min_pitch_angle, dist)
            xyzfile:write(x .. " " .. y .. " " .. z .. "\n")

            simInsertPointsIntoPointCloud(pointCloud, 1, {x, y, z})
            -- logfile:write(yaw_angle .. " " .. pitch_angle .. " " .. distance .. "\n")
        end   
        
        if (yaw_angle >= max_yaw_angle) then
            yaw_angle = min_yaw_angle
            pitch_angle = pitch_angle + angle_step
        else
            yaw_angle = yaw_angle + angle_step
        end

        if (pitch_angle > max_pitch_angle) then
            break
        end

        -- Now don't waste time in this loop if the simulation time hasn't changed! This also synchronizes this thread with the main script
        simSwitchThread() -- This thread will resume just before the main script is called again
    end
end

-- Initialization:
    simSetThreadSwitchTiming(200) -- We want to manually switch for synchronization purposes (and also not to waste processing time!)

    xyzfile = io.open("proximityReadings.xyz", "w")

    -- Get some handles first:
    yawJoint=simGetObjectHandle("proxJointYaw") 
    pitchJoint=simGetObjectHandle("proxJointPitch")
    proxSensor=simGetObjectHandle("proxSensor")
    pointCloud=simGetObjectHandle("Point_cloud")

    -- shifted range (yaw_angle - min_yaw_angle): 0 .. pi
    min_yaw_angle = (-3 * math.pi) / 2
    max_yaw_angle = -math.pi / 2
    yaw_angle = min_yaw_angle

    -- shifted range (pitch_angle - min_pitch_angle): 0 .. 0.75*pi
    min_pitch_angle = (-5 * math.pi) / 4 
    max_pitch_angle = (-2 * math.pi) / 4 
    pitch_angle = min_pitch_angle 

    angle_step = 0.05

    simSetJointTargetPosition(yawJoint, min_yaw_angle)
    simSetJointTargetPosition(pitchJoint, min_pitch_angle)
    simWait(3)

-- Execute the thread function:
    res,err=xpcall(threadFunction,function(err) return debug.traceback(err) end)
    if not res then
        simAddStatusbarMessage('Lua runtime error: '..err)
    end

-- Clean-up:
    xyzfile:close()

coppelia
Site Admin

Re: Distorted point cloud

Post by coppelia » 22 Sep 2017, 09:03

Hello,

Indeed, there must be an error in your calculations; things are slightly distorted. The two joints and the proximity sensor seem to be correctly aligned/configured.

You can slightly modify your code to see the data you would expect:


        local m=simGetObjectMatrix(proxSensor,-1)
        local result, dist, pt=simReadProximitySensor(proxSensor)
        if (result>0) then
            pt=simMultiplyVector(m,pt)
            local x=pt[1]
            local y=pt[2]
            local z=pt[3]
--            local x, y, z = sphereCoordsToCartesianCoords(yaw_angle - min_yaw_angle, pitch_angle - min_pitch_angle, dist)
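For reference, applying an object matrix to a point, as simMultiplyVector does above, is just R·p + t: V-REP object matrices are 12-element, row-major 3×4 poses [R | t]. A minimal Python sketch of that operation, independent of the simulator (the matrix values below are made up for illustration):

```python
def multiply_vector(m, p):
    """Apply a 3x4 row-major pose matrix [R | t] to a 3D point: R*p + t."""
    return [
        m[0] * p[0] + m[1] * p[1] + m[2]  * p[2] + m[3],
        m[4] * p[0] + m[5] * p[1] + m[6]  * p[2] + m[7],
        m[8] * p[0] + m[9] * p[1] + m[10] * p[2] + m[11],
    ]

# Identity rotation with a translation of (1, 2, 3) simply shifts the point.
identity_pose = [1, 0, 0, 1,
                 0, 1, 0, 2,
                 0, 0, 1, 3]
print(multiply_vector(identity_pose, [0.5, 0.0, 0.0]))  # [1.5, 2.0, 3.0]
```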
Cheers

nex

Re: Distorted point cloud

Post by nex » 26 Sep 2017, 21:32

Thanks a lot for your reply.

Your code works, but I can't rely on the "magic" simGetObjectMatrix() on my real-world robot. So I tried to adjust my code to match your approach as closely as possible, while relying only on the angles of servo motors and the distance reading of my physical proximity sensor.

I adjusted my code to use matrix multiplication, similar to your code, and the error remains (it is exactly the same as with the spherical-coordinate method, so at least it is consistent). I still can't find what causes it. The position of proxSensor stays the same (ignoring minor rounding errors), so I assume translations can be ignored and the focus can be on the rotations alone. Or does an object's matrix express anything besides translation and rotation?


    local pt = {0, 0, distance}

    -- my method adapted to use matrices
    local Ry = simBuildMatrix({0,0,0}, {0,yawAngle,0})
    local Rx = simBuildMatrix({0,0,0}, {pitchAngle,0,0})
    local R = simMultiplyMatrices(Rx, Ry)
    local wrongAngles = simGetEulerAnglesFromMatrix(R)
    local ptres = simMultiplyVector(R, pt) -- wrong result (but consistent with earlier method using polar coordinates)

    --your method
    local m = simGetObjectMatrix(proxSensor, -1)
    local correctAngles = simGetEulerAnglesFromMatrix(m)
    local R2 = simBuildMatrix({0,0,0}, correctAngles)

    local ptres2 = simMultiplyVector(R2, pt)  -- correct result
Your method yields the right results, while my method still shows the same distortions as before when I didn't use matrices (and used the polar coordinate method in the first post).

I am trying to narrow down the difference, but I have trouble visualizing what the issue is. Shouldn't simGetEulerAnglesFromMatrix() provide Euler angles that are independent of translations or any other transformation, i.e., that reflect only the rotation? Yet wrongAngles and correctAngles differ significantly (judging by the logged values).

The reason I need to understand the issue is that on my physical robot I can't use simGetObjectMatrix() (which sort of "cheats" by using simulator information); I have to rely on servo angles to compute the matrices.

That's why I use two joints: they provide values similar to what servo angles would give.
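The discrepancy between wrongAngles and correctAngles is typical of a rotation-order problem: matrix multiplication does not commute, so Rx·Ry and Ry·Rx send the same point to different places. A small numeric illustration in Python (an editor's sketch using plain 3×3 matrices, not the V-REP API):

```python
import math

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat_vec(m, v):
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]

p = [0.0, 0.0, 1.0]
a, b = math.pi / 3, math.pi / 4
p1 = mat_vec(mat_mul(rot_x(a), rot_y(b)), p)  # pitch composed after yaw
p2 = mat_vec(mat_mul(rot_y(b), rot_x(a)), p)  # yaw composed after pitch
# Both results have unit length, but they are different points:
print(p1, p2)
```

Picking the wrong composition order (or the wrong axes) still yields a valid rotation for every sample, which is why the cloud looks plausible overall yet is consistently warped.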

coppelia
Site Admin

Re: Distorted point cloud

Post by coppelia » 29 Sep 2017, 17:59

It can be very confusing to assemble the various transformation matrices. A joint carries two transformations: its position/orientation relative to its parent (M), and an intrinsic translation or rotation (Mi). The total transformation is Mtot = M * Mi.

I can't really dive too much into that, but I would write it like:


    local pt = {-distance,0,0}
    local Ry = sim.buildMatrix({0,0,0}, {0,0,yawAngle})
    local RR = sim.buildMatrix({0,0,0}, {0,-math.pi/2,-math.pi/2})
    local Rx = sim.buildMatrix({0,0,0}, {0,0,pitchAngle})
    local R = sim.multiplyMatrices(Ry, RR)
    R = sim.multiplyMatrices(R, Rx)
    local ptres = sim.multiplyVector(R, pt) -- still slightly wrong, but closer
It still gives wrong results, but it is a little bit more correct ;)

Another thing that could happen: your joints are dynamic, and you are using the target joint positions rather than the current ones. The joints may not yet have reached the commanded angle, yet you use that commanded angle to compute the detected point.
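That lag effect can be seen in isolation: if the joint tracks its target with a delay and the point is computed from the commanded angle instead of the measured one, every sample is displaced. A toy Python sketch (the first-order lag below is purely an illustrative assumption, not V-REP's joint controller):

```python
import math

target = math.pi / 4          # commanded yaw angle
actual = 0.0                  # joint starts at zero and lags behind
for _ in range(5):            # a few steps of first-order target tracking
    actual += 0.5 * (target - actual)

r = 1.0
# Point reconstructed from the commanded angle vs. from the measured angle:
p_cmd  = (r * math.cos(target), r * math.sin(target))
p_meas = (r * math.cos(actual), r * math.sin(actual))
err = math.dist(p_cmd, p_meas)
print(actual, err)  # actual has not yet reached target, so err > 0
```

In the simulation the fix is to read the current angle (e.g. with simGetJointPosition) instead of reusing the value passed to simSetJointTargetPosition; on a physical robot, to use the servo's reported position rather than the command.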

Cheers

nex

Re: Distorted point cloud

Post by nex » 30 Sep 2017, 18:48

I finally figured out how to compute the point cloud correctly, without relying on internal knowledge of the simulator.

I still have to understand some of the math (such as the order of operations and how exactly each coordinate system relates to the others) to know where I went wrong before. Once I have, I'll try to write up an explanation, or at least provide a simple working scene with code comments.

So far, in case anybody cares: one of the main issues was that joint rotations (i.e., the intrinsic rotation angles set via functions like simSetJointPosition() or simSetJointTargetPosition()) are always rotations around the joint's own z-axis. I was trying to express them relative to the joint's parent coordinate system, and apparently made some mistakes when I tried to "summarize" those calculations.

I may try to simplify those calculations again, or just leave them as they are: step-by-step matrix multiplications, which clearly show how you get from one reference frame to the next higher one (in the parent-child hierarchy of objects) until the world reference frame (the absolute coordinate system) is reached.
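The step-by-step chaining described above, multiplying each child-to-parent transform in turn until the world frame is reached, can be sketched with homogeneous 4×4 matrices. This is a generic one-joint illustration by the editor (the pose and angle values are made up), not the actual scene hierarchy:

```python
import math

def rot_z(a):
    """Rotation about the local z-axis, matching a joint's intrinsic rotation."""
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]

def translate(x, y, z):
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def transform(m, p):
    px, py, pz = p
    return [m[i][0] * px + m[i][1] * py + m[i][2] * pz + m[i][3]
            for i in range(3)]

# world <- joint pose in world <- intrinsic joint rotation (about local z)
joint_pose = translate(0.0, 0.0, 1.0)   # joint sits 1 m above the world origin
joint_rot = rot_z(math.pi / 2)          # servo angle: 90 degrees about local z
m_total = mat_mul(joint_pose, joint_rot)

p_local = [1.0, 0.0, 0.0]               # measured point in the joint's frame
p_world = transform(m_total, p_local)
print(p_world)                          # rotated onto +y, lifted to z = 1
```

With two joints, the chain simply grows by one more pose/rotation pair per joint, exactly the parent-by-parent multiplication described in the post.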

Thanks for the tips along the way.
