Quick RealityKit Tutorial 3: Custom Post-Processing

Dennis Ippel
Sep 26, 2021

This is a quick tutorial about an exciting new RealityKit feature: the ability to add custom post-processing to the rendering pipeline.

Post-processing can be used to add rendering effects such as bloom, scanline, fog, depth of field, cel shading, color grading, film grain, vignette, object occlusion, etc. Some of these are already part of the RealityKit pipeline but every now and then a project comes along that requires custom effects.

Creating custom post-processing effects in SceneKit was very painful. SCNTechnique could be used but it was bug-ridden and didn’t fully support Metal. It was possible to create a custom post-processing pipeline based on this technique but it required a lot of work and was a bit of a hack.

Fortunately the latest RealityKit version makes all this very easy. Support for custom post-processing effects was introduced at WWDC 21 and will be part of iOS 15.

Tapping Into Render Callbacks

RealityKit's ARView on iOS 15 has a new instance property called renderCallbacks of type ARView.RenderCallbacks. This is a struct with two closure properties: prepareWithDevice and postProcess. The first can be used to initialize anything that will be needed for post-processing, and the latter is used to apply the effect on a per-frame basis.

In this example we will use a Metal compute shader, but we could also use CIFilter effects here.

The postProcess closure receives an ARView.PostProcessContext parameter that has everything we need to create a new effect:

  • The Metal device (MTLDevice)
  • The current command buffer (MTLCommandBuffer)
  • The projection matrix (float4x4)
  • The source color texture, which contains RealityKit's full scene render (MTLTexture)
  • The corresponding source depth texture (MTLTexture)
  • The target color texture, into which we render our custom effects (MTLTexture)
  • The frame time (TimeInterval)

To get access to this context we can create a closure and assign it to arView.renderCallbacks.postProcess:
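As a minimal sketch, assuming a helper method named postProcessEffect(context:) that we'll fill in over the rest of this post (the name is mine, not part of the API):

arView.renderCallbacks.postProcess = { [weak self] context in
    // context is an ARView.PostProcessContext; the device, command buffer
    // and textures we need all hang off it.
    self?.postProcessEffect(context: context)
}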

Before we get into this there are two things we'll need to set up first: a simple Metal shader and an instance of MTLComputePipelineState.

We’ll start with the Metal shader. For the sake of brevity we’ll create a simple shader that inverts the color. Create a Shaders.metal file and add this shader kernel code:
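Here is a sketch of such a kernel; the kernel name postProcess and the half-precision texture types are my own choices (the exact version lives in the GitHub project linked at the end):

#include <metal_stdlib>
using namespace metal;

// Reads the rendered scene, inverts the RGB channels and writes the
// result into the texture RealityKit will display.
kernel void postProcess(texture2d<half, access::read>  sourceTexture [[texture(0)]],
                        texture2d<half, access::write> targetTexture [[texture(1)]],
                        uint2 gridPosition [[thread_position_in_grid]])
{
    // Guard against the rounded-up grid running past the texture edges.
    if (gridPosition.x >= targetTexture.get_width() ||
        gridPosition.y >= targetTexture.get_height()) {
        return;
    }

    half4 color = sourceTexture.read(gridPosition);
    color.rgb = 1.0h - color.rgb;
    targetTexture.write(color, gridPosition);
}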

This function takes three arguments:

  • texture2d sourceTexture: this is the context.sourceColorTexture that contains the fully rendered RealityKit scene.
  • texture2d targetTexture: this is the context.targetColorTexture that we'll write the result into.
  • uint2 gridPosition: the texture pixel coordinates.

The actual shader doesn't do very much: it reads the color from the source texture, inverts the RGB channels and writes the result into the target texture. This is what will be displayed on screen.

The other thing we'll need to create up front is the compute pipeline state (MTLComputePipelineState). A compute pipeline state is an object that contains a compiled compute pipeline. We'll load the shader function we just created and attach it to the pipeline state:
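Sketched out, assuming the kernel above is named postProcess, lives in the app's default Metal library, and that device is the MTLDevice we've been handed (for example via the prepareWithDevice callback or context.device):

guard let library = device.makeDefaultLibrary(),
      let kernelFunction = library.makeFunction(name: "postProcess") else {
    fatalError("Unable to load the postProcess kernel")
}
// Compile the kernel into a reusable compute pipeline state.
postProcessPipelineState = try? device.makeComputePipelineState(function: kernelFunction)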

One more important thing we have to do is compute the threadgroup and grid sizes. I won’t go into the details because this is well documented on the Apple Developer website.

The complete function looks like this:
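A sketch of that complete function, wired up as the prepareWithDevice callback; the property names postProcessPipelineState and threadsPerThreadgroup are mine, and the per-frame grid size is derived from the texture dimensions in the next step:

var postProcessPipelineState: MTLComputePipelineState?
var threadsPerThreadgroup = MTLSize()

func prepareWithDevice(_ device: MTLDevice) {
    guard let library = device.makeDefaultLibrary(),
          let kernelFunction = library.makeFunction(name: "postProcess"),
          let pipelineState = try? device.makeComputePipelineState(function: kernelFunction) else {
        fatalError("Unable to create the compute pipeline state")
    }
    postProcessPipelineState = pipelineState

    // Ask the pipeline state for a sensible threadgroup shape.
    let width = pipelineState.threadExecutionWidth
    let height = pipelineState.maxTotalThreadsPerThreadgroup / width
    threadsPerThreadgroup = MTLSize(width: width, height: height, depth: 1)
}

In this sketch the function is assigned to arView.renderCallbacks.prepareWithDevice so it runs before the first post-processed frame.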

Now it is time to tap into the render callback to perform custom post-processing. RealityKit makes this easy because it provides us with the textures and the current command buffer. These are the steps we have to take:

  • create a compute command encoder (MTLComputeCommandEncoder)
  • set the pipeline state we created earlier
  • set the source and target textures
  • dispatch the threadgroups
  • finish encoding

RealityKit takes care of the rest after this, and the context.targetColorTexture, now containing the inverted RGB colors, is shown on screen:
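Putting those steps together, here is a sketch of the postProcessEffect(context:) helper referenced earlier (again, the names are mine; the GitHub project below has the exact code):

func postProcessEffect(context: ARView.PostProcessContext) {
    guard let pipelineState = postProcessPipelineState,
          let encoder = context.commandBuffer.makeComputeCommandEncoder() else { return }

    encoder.setComputePipelineState(pipelineState)
    encoder.setTexture(context.sourceColorTexture, index: 0)
    encoder.setTexture(context.targetColorTexture, index: 1)

    // One thread per pixel: round the grid size up so the whole
    // target texture is covered.
    let w = threadsPerThreadgroup.width
    let h = threadsPerThreadgroup.height
    let threadgroupsPerGrid = MTLSize(
        width: (context.targetColorTexture.width + w - 1) / w,
        height: (context.targetColorTexture.height + h - 1) / h,
        depth: 1)

    encoder.dispatchThreadgroups(threadgroupsPerGrid, threadsPerThreadgroup: threadsPerThreadgroup)
    encoder.endEncoding()
}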

This simple example doesn’t do justice to this powerful feature but is a good starting point for more advanced multi-pass post-processing effects.

The complete example project can be found on GitHub: https://github.com/MasDennis/RealityKitPostProcessing
