Custom depth buffer

Hi, I’m trying something to fix the z-write problem in the picture. What I need to do is prevent the opaque water from appearing in the depth buffer. To do this, I take a separate render from another camera and project it onto the model, making sure that the opaque water is not visible to that camera (via layers). Then I can project the depth texture with "_task.ViewMode = ViewMode.Depth". But I have a problem here: I don’t know how to calculate the depth fade. (I need to color the water by depth.)

If I understood correctly, with this method I can only work with the depth texture, not its vertex shader, so it’s impossible for me to calculate the depth fade this way? If that’s the case, how can I add my own custom depth shader to the camera?

Here is a small tutorial on how to render custom objects to a depth buffer and use it for outlines: HOWTO: Render object outline | Flax Documentation

You could try rendering the water surface depth and using it to manually reject or fade water pixels when rendering in transparent mode.
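For the fade term itself, the usual math is a sketch like the following (illustrative only; in practice this lives in the water material/shader, the names are made up, and it assumes both depths have already been linearized):

```csharp
// Sketch of typical depth-fade math (illustrative; normally computed in the
// water material/shader, not in engine-side C#).
// sceneDepth   - linear depth sampled from the (custom) depth buffer
// pixelDepth   - linear depth of the water-surface pixel being shaded
// fadeDistance - artist-tuned distance over which the fade spans
float DepthFade(float sceneDepth, float pixelDepth, float fadeDistance)
{
    // 0 where the water touches geometry, rising to 1 once the gap
    // behind the surface exceeds fadeDistance
    return Mathf.Saturate((sceneDepth - pixelDepth) / fadeDistance);
}
```

Multiplying the water color (or its opacity) by this value gives the soft shoreline/intersection fade.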


This is awesome! Thank you!

Hello again. I don’t know if it’s clear enough in the video, but when I use the custom depth from the example, the depth render lags behind when the camera moves. Is there any way to prevent this?

The only change I made in the code is assigning the CustomDepth texture to the water material, just like in the picture:

I guess you are probably using a custom RenderTask which might be rendering depth after the main scene, so the scene rendering gets a depth from the previous frame, which has an incorrect camera position. To solve this, I think you can just adjust the drawing order of your task:

task.Order = -100;

I hope this will work out!

If you use a PostProcessEffect script, it’s called after scene rendering, so you need to change it to a RenderTask like here: HOWTO: Render a camera to a texture | Flax Documentation, and register to the Render event of the task.


Thank you mafiesto4! Yes, I was using a Post Process Effect. Sorry, I’m a bit of a novice. I tried using a script as in the link you gave instead of the Post Process Effect, but I couldn’t understand how to initialize GPUContext and RenderContext. I tried it like this:

renderContext = new RenderContext();
context = new GPUContext();
.....
var desc = GPUTextureDescription.New2D(1465, 820, PixelFormat.D32_Float, GPUTextureFlags.DepthStencil | GPUTextureFlags.ShaderResource);
var customDepth = RenderTargetPool.Get(ref desc);
context.ClearDepth(customDepth.View());

// Draw objects to depth buffer
Renderer.DrawSceneDepth(context, renderContext.Task, customDepth, Actors);
.....

Then I am getting this error:

[ 00:10:06.541 ]: [Error] Failed to spawn object of type 'FlaxEngine.GPUContext'.
[ 00:10:06.543 ]: [Warning] Exception has been thrown during . Failed to create native instance for object of type FlaxEngine.GPUContext (assembly: FlaxEngine.CSharp, Version=1.3.6228.0, Culture=neutral, PublicKeyToken=null).
Stack trace:
  at FlaxEngine.Object..ctor () [0x00030] in F:\FlaxEngine\Source\Engine\Scripting\Object.cs:50 
  at FlaxEngine.GPUContext..ctor () [0x00000] in F:\FlaxEngine\Cache\Intermediate\FlaxEditor\Windows\x64\Development\Graphics\Graphics.Bindings.Gen.cs:1835 
  at OutlineRenderer.OnEnable () [0x0000c] in C:\Users\ElVahel\Documents\Flax Projects\Shadertest\Source\Game\pp.cs:79 
  at (wrapper native-to-managed) OutlineRenderer.OnEnable(OutlineRenderer,System.Exception&)
[ 00:10:06.543 ]: [Warning] Exception has been thrown during .
Failed to create native instance for object of type FlaxEngine.GPUContext (assembly: FlaxEngine.CSharp, Version=1.3.6228.0, Culture=neutral, PublicKeyToken=null)

I know this is a very newbie question/problem and I’m embarrassed to keep you busy with it, but even though I searched the internet all night and looked at the engine’s source code to try to understand the method, I still couldn’t succeed. Although the manual is very good, unfortunately the number of examples is very small.

I would appreciate it if you could guide me a little more on the subject.

Edit: I’ve also tried “OutlineRenderer : RenderTask”. This time, I can’t override Render as a method:

 error CS0505: 'OutlineRenderer.Render(GPUContext, ref RenderContext)': cannot override because 'RenderTask.Render' is not a function.

You could try something like this:

public class MyRenderScript : Script
{
    private RenderTask _task;

    public List<Actor> Actors;
    
    public override void OnEnable()
    {
        // Create rendering task
        if (_task == null)
            _task = new RenderTask();
        _task.Render += OnRender; // RenderTask will call OnRender method during rendering
        _task.Order = -100; // Before the main scene rendering
        _task.Enabled = true;
    }

    private void OnRender(RenderTask task, GPUContext context)
    {
        var desc = GPUTextureDescription.New2D(1465, 820, PixelFormat.D32_Float, GPUTextureFlags.DepthStencil | GPUTextureFlags.ShaderResource);
        var customDepth = RenderTargetPool.Get(ref desc);
        context.ClearDepth(customDepth.View());

        // Draw objects to depth buffer (use main rendering task to view)
        Renderer.DrawSceneDepth(context, MainRenderTask.Instance, customDepth, Actors);

        RenderTargetPool.Release(customDepth);
    }

    public override void OnDisable()
    {
        // Cleanup resources
        Destroy(ref _task);
    }
}

Thank you. I wouldn’t have been able to figure it out if you hadn’t shared the example. Now I’m starting to understand the logic of the Render event. However, unfortunately the main problem is still not solved.

When I had the same problem with the “HOWTO: Render a camera to a texture” example, I solved it using OnFixedUpdate: I used the OnFixedUpdate event instead of OnUpdate to sync the render camera’s position with the main camera. I am sharing the video below:

But the OnRender method doesn’t have a camera, so I have no position to sync. Could it be a problem with the FreeCamera script? Or is there something else I’ve overlooked?

Maybe you could sync the camera during rendering? Also, take a look at the Time Settings - by default game logic runs at 30 FPS and physics/rendering at 60 FPS - you might want to adjust it and set them all to 60.
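Such a sync could look roughly like this (a sketch; "RenderCamera" is an assumed field pointing at the camera the custom task renders from):

```csharp
// Sketch: keep the secondary render camera glued to the main camera so the
// custom depth is rendered from the current frame's viewpoint.
// "RenderCamera" is an assumed reference to the custom task's camera.
public Camera RenderCamera;

public override void OnFixedUpdate()
{
    var main = Camera.MainCamera;
    if (main && RenderCamera)
    {
        RenderCamera.Position = main.Position;
        RenderCamera.Orientation = main.Orientation;
        RenderCamera.FieldOfView = main.FieldOfView;
    }
}
```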


I solved it like this (I’m sharing it in case someone else needs it): I used _task instead of MainRenderTask.Instance in the Renderer.DrawSceneDepth call and got the render from _task’s output. This is probably what I should have done from the beginning, but I didn’t understand it at the time.

//Renderer.DrawSceneDepth(context, MainRenderTask.Instance, CustomDepth, Actors);
Renderer.DrawSceneDepth(context, _task, CustomDepth, Actors);

Then I added a camera to _task and used the OnFixedUpdate event, as mentioned earlier, to sync its position. That solved the problem.
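Put together, the final setup might look like this (a sketch; it assumes _task is a SceneRenderTask, which exposes a Camera property, and that "RenderCamera", CustomDepth, and Actors are set up as in the earlier snippets):

```csharp
// Sketch of the final approach: render depth through _task (which has its
// own camera, kept in sync via OnFixedUpdate) instead of MainRenderTask,
// so the custom depth matches the current view.
_task.Camera = RenderCamera; // assumed secondary camera

// Inside the task's Render handler:
Renderer.DrawSceneDepth(context, _task, CustomDepth, Actors);
```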

Thank you again, you’re so helpful!