V-REP's GPU usage

renaud.detry
Posts: 16
Joined: 24 May 2013, 16:29

V-REP's GPU usage

Post by renaud.detry »

Hello,

I'm wondering how much GPU power V-REP makes use of. Concretely:

Will any of the demos provided with V-REP run faster with a GeForce GT 750M than with an Intel Iris?

Are there some features of V-REP that can easily saturate a modern GPU?

I'm rather clueless in this domain. The images rendered by V-REP look simpler than those produced by modern games, which tends to indicate that V-REP is a light GPU user, but I may be wrong. Thanks for your input!

RD.

coppelia
Site Admin
Posts: 10336
Joined: 14 Dec 2012, 00:25

Re: V-REP's GPU usage

Post by coppelia »

Hello Renaud,

V-REP uses the GPU only for rendering and for handling vision sensors. In the case of vision sensors, if the produced image is large, this can slow V-REP down, since the image is retrieved from the GPU in order to do processing (then sent back to the GPU). In the future we hope to be able to do some image processing directly on the GPU.
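To make the GPU→CPU transfer concrete, here is a minimal child-script sketch in Lua (the sensor name 'cam' is an assumption; any vision sensor in your scene works). Retrieving the image is the costly step described above:

```lua
-- Sketch of explicit vision sensor handling in a child script.
-- Assumption: the scene contains a vision sensor object named 'cam'.
local sensor=simGetObjectHandle('cam')
simHandleVisionSensor(sensor)               -- render the sensor view (on the GPU)
local image=simGetVisionSensorImage(sensor) -- copy the image back to the CPU (the slow part)
local res=simGetVisionSensorResolution(sensor)
-- 'image' is a flat table of RGB values (res[1]*res[2]*3 entries);
-- the larger the resolution, the more expensive the transfer.
```

Keeping the sensor resolution as small as your application allows is the simplest way to reduce this cost.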

Apart from that, only the rendering makes use of the GPU. You can check at the top of the main window how much time is needed to generate a frame. An empty scene should not take more than 10 ms; if it does, your graphics card is rather weak.

By default, each simulation step is followed by a display step, which means the simulation speed directly depends on the display speed. But you can skip display frames by clicking the rabbit toolbar button, or by adjusting the simulation settings. You can also turn off rendering during simulation with the model other/fast simulation mode.ttm.
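Rendering can also be toggled from a script via the boolean parameter sim_boolparam_display_enabled; a hedged sketch (check the API reference of your release for the exact constant names):

```lua
-- Turn rendering off for a compute-heavy phase, then back on.
simSetBoolParameter(sim_boolparam_display_enabled,false) -- disable rendering
-- ... run the simulation steps that should not be slowed down by display ...
simSetBoolParameter(sim_boolparam_display_enabled,true)  -- re-enable rendering
```

This is convenient when only part of a simulation needs to run at maximum speed.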

You can also activate the threaded rendering mode (the rocket toolbar button), in which case the rendering will not slow down the simulation. As of release V3.0.5 that feature is very unstable, but it will be stable in the next release (January 2013). Threaded rendering can drastically increase the simulation speed, under two conditions:
  • your scripts should avoid creating or destroying data, because in that case the rendering thread has to be momentarily halted (and the simulation thread has to wait until the rendering thread has stopped; otherwise the rendering thread risks accessing destroyed data)
  • your scripts should not handle vision sensors, since the rendering thread then has to take over the vision sensor handling, and here again the simulation thread has to wait.
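The two conditions above concern what your scripts do while threaded rendering is active. For completeness, the mode can also be enabled programmatically; the constant name sim_boolparam_threaded_rendering_enabled is an assumption on my part, so verify it against the API reference of your release:

```lua
-- Sketch: enable threaded rendering from a script (equivalent to the
-- rocket toolbar button). Constant name is assumed; check your release.
simSetBoolParameter(sim_boolparam_threaded_rendering_enabled,true)
-- While this mode is active, avoid creating/destroying data and avoid
-- handling vision sensors in scripts, as explained above.
```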
Cheers
