extends Node2D

var myViewport
var myCanvas

func _ready():
    ## Stop the main viewport from drawing when we force draw.
    ## This should loop through all other viewports as well.
    VisualServer.viewport_set_active(get_viewport().get_viewport_rid(), false)

    # Initialize the Viewport; it needs a canvas.
    myViewport = VisualServer.viewport_create()
    myCanvas = VisualServer.canvas_create()
    VisualServer.viewport_attach_canvas(myViewport, myCanvas)
    VisualServer.viewport_set_size(myViewport, 64, 64)
    ## "active" instructs it to be drawn.
    VisualServer.viewport_set_active(myViewport, true)

    ## Create a CanvasItem and add it to the canvas, just as in the servers tutorial.
    var ci_rid = VisualServer.canvas_item_create()
    VisualServer.viewport_set_canvas_transform(myViewport, myCanvas, Transform2D())
    VisualServer.canvas_item_set_parent(ci_rid, myCanvas)
    var sprite = load("res://icon.png")
    VisualServer.canvas_item_add_texture_rect(ci_rid, Rect2(Vector2(0, 0), sprite.get_size()), sprite)

    ## Draw it once; this should be bundled into a function.
    VisualServer.viewport_set_update_mode(myViewport, VisualServer.VIEWPORT_UPDATE_ONCE)
    VisualServer.viewport_set_vflip(myViewport, true)
    VisualServer.force_draw(false)
    var image = VisualServer.texture_get_data(VisualServer.viewport_get_texture(myViewport))
    var texture = ImageTexture.new()
    texture.create_from_image(image)
    $Sprite2.texture = texture

    # Exact same as above.
    VisualServer.viewport_set_update_mode(myViewport, VisualServer.VIEWPORT_UPDATE_ONCE)
    VisualServer.viewport_set_vflip(myViewport, false)
    VisualServer.force_draw(false)
    var image2 = VisualServer.texture_get_data(VisualServer.viewport_get_texture(myViewport))
    var texture2 = ImageTexture.new()
    texture2.create_from_image(image2)
    $Sprite3.texture = texture2

    # Prints 0, as no frames have actually drawn.
    print(Engine.get_frames_drawn())

    # Reset the main viewport so everything actually draws to the screen.
    VisualServer.viewport_set_active(get_viewport().get_viewport_rid(), true)
I am not exactly sure what your filtering issue is (or what causes it), but yes, your code above is close to my suggestion. My suggestion was to set the filter flag on the output texture (what you call "out"). You have the flags line commented out right now; my suggestion was to ensure that the flags include the filter flag.
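A minimal sketch of that idea, assuming out is an ImageTexture built from the viewport image (the variable names here are just placeholders):

var out = ImageTexture.new()
# Pass the flags explicitly so FLAG_FILTER is definitely included.
out.create_from_image(image, Texture.FLAG_FILTER | Texture.FLAG_MIPMAPS)
# Or toggle it after creation:
out.flags |= Texture.FLAG_FILTER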
Appreciate the help. That is exactly the place to set it.
I see now that without any assignment the flags already included filtering, and by forcing out.flags = 0 I could confirm that that part is indeed working and isn't the problem.
The cause is the viewport's transparency option. I've been digging in the source, but nothing jumps out at me as to what the game's main viewport does differently to avoid this issue with transparency. Is there some option I'm overlooking?
Toggling this gets different results:
VisualServer.viewport_set_transparent_background(vp, true) # false: edges are fine; true: dark edges
With transparency off everything is fine, but I can't use that since I need the transparency.
With transparency on, the viewport texture has dark edges.
And just as a reference for anyone passing by: the camera transform isn't exposed on Camera2D, and VisualServer doesn't have a viewport_get_canvas_transform(), but fortunately Viewport nodes expose the canvas transform they're using as a property. I went with this:
var inv_cam = get_viewport().canvas_transform.affine_inverse()
VisualServer.viewport_set_canvas_transform(vp, cv, inv_cam)
@avencherus Nothing jumps out at me either. Maybe the blend mode of the sprite's material? My example doesn't include materials, so maybe the default blend mode in the server is wrong. When using a transparent background the background is set to transparent black, so that is why the edges are blending to black. There must be a setting to ensure that they maintain their color when blending with the background.
Oh, is it a potential engine issue? The default blend mode hypothesis may be right. The TextureRect can be fiddled with via shaders after the fact, but if there is some way to use what's built in, I'd rather not add custom shaders to everything. I'm not sure yet what the performance implications would be.
Oh, is it a potential engine issue?
Potentially, but probably not.
There are no performance implications for specifying a material for your objects. You really should manage your materials on your own instead of relying on the default. That being said, I'm not certain that the issue is related to materials anyway.
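As a sketch of managing the material explicitly (not a confirmed fix, just something to experiment with), assuming ci_rid is the canvas item RID from the snippet at the top:

var mat = CanvasItemMaterial.new()
mat.blend_mode = CanvasItemMaterial.BLEND_MODE_PREMULT_ALPHA  # compare with BLEND_MODE_MIX
VisualServer.canvas_item_set_material(ci_rid, mat.get_rid())
# Keep a reference to mat (e.g. in a member variable); if the Resource is freed,
# the material RID becomes invalid.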
I was messing around with something like this recently and see the same slightly-dark edges when rendering with a transparent viewport background. Did you ever work out what causes this and how to avoid it?
@BigZaphod I believe the black edges are caused by the fact that when transparent background is enabled, the viewport clears with a transparent black color (Color(0, 0, 0, 0)). When rendering to a texture, the sprite renders to the texture just fine. When sampling the viewport texture, however, bilinear filtering causes the texels around the sprite edges to blend between the sprite color and the viewport's background, which is still black. I bet that if you disable texture filtering on both the viewport texture and the sprite, the artifact would disappear.
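A quick way to test that hypothesis, reusing myViewport and image from the snippet at the top:

# Clear all flags on the viewport texture so sampling it is not bilinear-filtered.
VisualServer.texture_set_flags(VisualServer.viewport_get_texture(myViewport), 0)
# And build the output texture without FLAG_FILTER as well.
var unfiltered = ImageTexture.new()
unfiltered.create_from_image(image, 0)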
If that's the case, the solution would be to normalize and dilate the color data of the viewport texture: where alpha is less than one but greater than zero, divide the color by the alpha; where alpha is close to or equal to zero, take the color information from the surrounding pixels instead.
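A minimal sketch of the normalize step only (dilation omitted), operating on the Image returned by texture_get_data():

func normalize_edges(img):
    img.lock()
    for y in img.get_height():
        for x in img.get_width():
            var c = img.get_pixel(x, y)
            if c.a > 0.0 and c.a < 1.0:
                # Un-premultiply: divide the color by alpha so edge pixels keep their color.
                img.set_pixel(x, y, Color(c.r / c.a, c.g / c.a, c.b / c.a, c.a))
    img.unlock()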
Hi @clayjohn, and thanks for this snippet, which I've successfully used many times.
I have one question though: it doesn't seem to work reliably when executed from a thread. Sometimes I get the texture, sometimes I get an empty image.
It seems like force_draw is missing some thread sync logic.
Have you experienced this, and would you have any suggestions?
@Flarkk When writing this I assumed it would be run from the main thread. There is currently no logic to synchronize multiple threads. You will need to check that drawing has actually finished before retrieving the texture.
You can try using VisualServer.request_frame_drawn_callback or the frame_post_draw signal.
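For instance, a sketch of the signal approach (main-thread usage assumed, myViewport as in the snippet above):

VisualServer.viewport_set_update_mode(myViewport, VisualServer.VIEWPORT_UPDATE_ONCE)
# Wait until the VisualServer reports that a frame has actually been drawn.
yield(VisualServer, "frame_post_draw")
var img = VisualServer.texture_get_data(VisualServer.viewport_get_texture(myViewport))
# Alternatively: VisualServer.request_frame_drawn_callback(self, "_on_frame_drawn", null)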
@clayjohn thanks for the hint. But something still sounds wrong to me: if force_draw guarantees that all viewports have been drawn afterwards when called from the main thread (whatever the underlying mechanism used to sync with the VisualServer thread: semaphore, condition variable, etc.), what's different about calling it from another thread?
I've also spotted the force_sync method, but the docs are really light on it. I wonder whether it can help or not.
Note that I'm developing in C++ NativeScript, so I'm a bit reluctant to use Godot-style signals or callbacks (preferring good old C++ style :-)
@Flarkk force_draw does not guarantee that all viewports have been drawn. It just submits the draw commands to the GPU immediately instead of at the end of the frame.
Unfortunately, I don't think force_sync() is going to help you much :(
https://github.com/godotengine/godot/blob/5fe89e8ccda8f46c811e6f3fd601ca09ae138e17/servers/visual/visual_server_raster.cpp#L130-L131
Again, I didn't write this code assuming it would be run from another thread, so I can't do much troubleshooting for you. I understand that Godot, by default, tries to run all VisualServer commands on the main thread, so you could be getting a bit out of sync; running with a different thread model may help. Otherwise, you are just going to have to figure out a way to wait until the draw commands have finished executing before requesting the data from the GPU.
Ok got it, many thanks.
For now I'll stick with waiting a bit for the draw commands to complete (since the work is done on a thread, it doesn't freeze the entire game).
I'll rewrite the whole thing with compute shaders when Godot 4.0 is out, as it should allow texture rendering to be completely separated from the draw process.
@clayjohn Thanks for taking the time to reply, those explanations help.
Regarding get_global_transform_with_canvas(): I went with that because sprite offsets and pivots seem very difficult to account for from GDScript, and I didn't see any clear, immediate way to factor them into the resulting global Transform2D. I was hoping for a clean solution with the API rather than having to revisit matrix math and rewrite things to invert what the engine is doing. Do any simple ideas come to mind? I'm not sure which method is most time consuming at the moment: calculating a transform with sprite offsets, or inverting the camera matrix. Is there anything exposed to GDScript that gets the Camera2D transform it sends to the viewport?
I'm not too sure what you mean by the filtering on the texture being rendered to. There isn't anything imported; it's just being taken from the viewport and put on a TextureRect. I experimented with several ways of trying to apply the filtering flag, and it doesn't seem to have any effect. Are you referring to some other method, do I apply some transparent imported texture somewhere, or am I missing something?
This is a snippet of my most recent iteration. Is this close to your suggestion?