
I'm rendering a camera's output to a RenderTexture on a plane. This works fine. But I want to place GameObjects on top of that plane at the positions where world objects appear in the camera's view.

`camera.WorldToViewportPoint(obj.transform.position)` returns the correct viewport positions, but these are not mapped onto the RenderTexture plane. How can I do this?

So the pipeline is: GameObject in world → camera input → plane with RenderTexture → world or local position of a GameObject on top of the RenderTexture plane.
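A minimal sketch of that last step, assuming the display plane is Unity's default Plane primitive (which spans −5..+5 in local X and Z) and that the RenderTexture is applied to it unflipped; the field names (`sourceCamera`, `displayPlane`, `target`, `marker`) are placeholders, and the axis signs may need adjusting for your UV orientation:

```csharp
using UnityEngine;

public class RenderTextureMarker : MonoBehaviour
{
    public Camera sourceCamera;     // camera rendering into the RenderTexture
    public Transform displayPlane;  // plane displaying the RenderTexture
    public Transform target;        // world object visible to the camera
    public Transform marker;        // GameObject to place on the plane

    void LateUpdate()
    {
        // Viewport coordinates: (0,0) bottom-left, (1,1) top-right.
        Vector3 vp = sourceCamera.WorldToViewportPoint(target.position);

        // Hide the marker when the target is behind the camera.
        if (vp.z < 0f) { marker.gameObject.SetActive(false); return; }
        marker.gameObject.SetActive(true);

        // Map viewport (0..1) onto the plane's local extents.
        // Unity's Plane primitive is 10x10 units, centered at the origin.
        Vector3 local = new Vector3((vp.x - 0.5f) * 10f, 0f, (vp.y - 0.5f) * 10f);

        // TransformPoint honors the plane's position, rotation, and scale.
        marker.position = displayPlane.TransformPoint(local);
    }
}
```

If you display the RenderTexture on a RawImage or a scaled quad instead, the same idea applies, but the local extents (±5 here) must be replaced by that object's own dimensions.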

  • When you say "mapped to the rendertexture," do you mean matching the pixel coordinates of the rendertexture's texture space? Or its UV coordinates? Or do you have a quad / RawImage in your scene displaying the rendertexture, and you want the local / worldspace coordinates of the corresponding point on this display object? — Commented Nov 8, 2017 at 15:46
  • Hi, thanks for your reply. I want the world position of the pixel on the RenderTexture where the GameObject is visible, so I can position 3D objects on top of the RenderTexture plane. For example: you see a chair; I point a camera at the chair; the camera renders its output to the plane's RenderTexture; I then want to place a GameObject on top of the plane at the chair's position. — Commented Nov 17, 2017 at 13:31
  • To help you here, we'll need to know what kind of object you're using to display your RenderTexture, and how it's configured in the inspector and hierarchy. Depending on your setup, the coordinates might be measured from different parts of the image, or need compensation for scaling/etc. Editing your question to include some screenshots is probably the fastest and clearest way to show this info. — Commented Nov 17, 2017 at 13:44

