Quick RealityKit Tutorial 1: Programmatic non-AR Setup

Dennis Ippel
Apr 1, 2021 · 5 min read

We can safely say that SceneKit is being abandoned in favor of RealityKit. Why? For the past two years no new features have been added, serious bugs haven’t been fixed, and SceneKit hasn’t been mentioned at WWDC.

RealityKit, on the other hand, has been at the forefront. It has great potential, but documentation is scarce and some important features are missing. As of this writing there’s no way to set up custom geometry and there is no support for custom shaders.

Apple engineers are actively encouraging developers to start using RealityKit instead of SceneKit. Not only for ARKit apps, but also for apps that use ‘regular’ 3D rendering.

So it is time to start looking into RealityKit. Most of the examples use AR and Reality Composer but I was keen on getting a programmatic non-AR example working. It turns out that this isn’t very well-documented. I found some pieces here and there and decided to put them together in this quick tutorial.

Here’s what I want to accomplish in this example:

  • Use an HDR map for environment lighting and as the scene background.
  • Programmatically create a reflective material and attach this to a sphere.
  • Create an animation loop for the camera so that it circles around the sphere.

Sounds simple enough but I had to do some digging to get this working. Here we go.

Setting Up The View

Create a new Xcode project using the App template. This will create a bare-bones project with a single view controller.

This is all we need to create the ARView programmatically:

let arView = ARView(frame: view.frame,
                    cameraMode: .nonAR,
                    automaticallyConfigureSession: false)
view.addSubview(arView)

By specifying .nonAR for the cameraMode parameter we make sure that we create a 3D-only view without augmented reality. Because we’re not using AR, we don’t have to configure a session.

Setting Up The Environment

I want to use this HDR image that I downloaded from HDRI Haven. Setting an .hdr or .exr image in SceneKit is as simple as:

scene.lightingEnvironment.contents = "myhdrimage.hdr"

This is a bit different in RealityKit. First you have to create a folder (in Finder) with a name that ends in a .skybox suffix, for instance aerodynamics_workshop.skybox. Then place the .hdr or .exr file inside and drag the folder into the project navigator in Xcode. Choose ‘Create folder references’ and add the folder to the app target. Xcode will compile the image as an environment resource.

This resource can now be used for environment lighting and the scene background. The resource can be referenced by using the file name without the extension:

let skyboxName = "aerodynamics_workshop_map" // the .exr or .hdr file name, without the extension
let skyboxResource = try! EnvironmentResource.load(named: skyboxName)
arView.environment.lighting.resource = skyboxResource
arView.environment.background = .skybox(skyboxResource)

Note the difference with SceneKit’s SCNMaterialProperty: there you would set the environment and the background in a more consistent manner:

scene.lightingEnvironment.contents = "myhdrimage.hdr"
scene.background.contents = "myhdrimage.hdr"

Creating A Sphere

Now we’re going to add a reflective sphere to the scene. First we’ll create a material:

var sphereMaterial = SimpleMaterial()
sphereMaterial.metallic = MaterialScalarParameter(floatLiteral: 1)  // fully metallic
sphereMaterial.roughness = MaterialScalarParameter(floatLiteral: 0) // perfectly smooth, so it reflects like a mirror

Now we can use one of MeshResource’s static methods to create our sphere primitive and assign the material we’ve just created:

let sphereEntity = ModelEntity(mesh: .generateSphere(radius: 1),
                               materials: [sphereMaterial])

Now we’ll need to create an AnchorEntity, place it at the center of the scene, add the sphereEntity as a child, and then add the anchor entity to the scene:

let sphereAnchor = AnchorEntity(world: .zero)
sphereAnchor.addChild(sphereEntity)
arView.scene.addAnchor(sphereAnchor)

Creating and Animating a Camera

The last thing we’ll need to do is add a perspective camera. A static camera isn’t that exciting so we will animate it in a circle while looking at the center where our sphere is.

Creating and adding the camera is straightforward:

let cameraEntity = PerspectiveCamera()
cameraEntity.camera.fieldOfViewInDegrees = 60
let cameraAnchor = AnchorEntity(world: .zero)
cameraAnchor.addChild(cameraEntity)
arView.scene.addAnchor(cameraAnchor)

Creating a custom animation requires a bit more work in RealityKit. Entities can be animated if they conform to the HasTransform protocol. This protocol defines a couple of .move() methods. A transform can be passed to these methods, as well as a duration and an animation timing function. This is fine for simple animations, but to accomplish something a bit different we’ll have to hook into the render loop.
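Just for reference, such a simple one-off .move() animation could look something like this (a minimal sketch using the sphereEntity from earlier; the target transform is an arbitrary example):

// Animate the sphere to a new transform over two seconds.
var targetTransform = sphereEntity.transform
targetTransform.translation.y += 0.5 // lift the sphere by half a metre (example value)
sphereEntity.move(to: targetTransform,
                  relativeTo: sphereEntity.parent,
                  duration: 2.0,
                  timingFunction: .easeInOut)

For the orbiting camera, though, we need per-frame control, so we’ll hook into the render loop instead.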

In SceneKit you can use delegates to hook into different stages of the render loop and execute code on the SceneKit render thread. In RealityKit we can do something similar by subscribing to SceneEvents.Update. First we’ll have to import the Combine framework, which lets us customize the handling of asynchronous events with event-processing operators.

import Combine

Next we’ll need to create an instance variable that holds a strong reference to a Cancellable object. If this were a local variable it wouldn’t work, because the subscription would be deallocated immediately.

private var sceneEventsUpdateSubscription: Cancellable!

Now we can subscribe to SceneEvents.Update which is triggered once per frame interval:

sceneEventsUpdateSubscription = arView.scene.subscribe(to: SceneEvents.Update.self) { _ in
// do stuff
}

Now we can use this to create our custom camera animation:

let cameraDistance: Float = 3
var currentCameraRotation: Float = 0
let cameraRotationSpeed: Float = 0.01
sceneEventsUpdateSubscription = arView.scene.subscribe(to: SceneEvents.Update.self) { _ in
    // Calculate a point on a circle around the scene's origin.
    let x = sin(currentCameraRotation) * cameraDistance
    let z = cos(currentCameraRotation) * cameraDistance
    let cameraTranslation = SIMD3<Float>(x, 0, z)
    let cameraTransform = Transform(scale: .one,
                                    rotation: simd_quatf(),
                                    translation: cameraTranslation)
    cameraEntity.transform = cameraTransform
    // look(at:from:relativeTo:) positions the camera and points it at the origin.
    cameraEntity.look(at: .zero, from: cameraTranslation, relativeTo: nil)
    currentCameraRotation += cameraRotationSpeed
}

Here we establish circular motion by using sin and cos to calculate the camera’s position. We set the camera’s transform directly and make sure that the camera looks at the scene’s center (.zero), where the sphere is.

Putting It All Together

Here are all the bits put together into one view controller.
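This is a sketch of what that view controller could look like, assembled from the snippets above (it assumes the same skybox file name used in this example; yours will differ):

import UIKit
import RealityKit
import Combine

class ViewController: UIViewController {

    // Keep a strong reference so the subscription isn't deallocated.
    private var sceneEventsUpdateSubscription: Cancellable!

    override func viewDidLoad() {
        super.viewDidLoad()

        // A non-AR RealityKit view; no AR session needs to be configured.
        let arView = ARView(frame: view.frame,
                            cameraMode: .nonAR,
                            automaticallyConfigureSession: false)
        view.addSubview(arView)

        // Use the compiled .skybox resource for image-based lighting and the background.
        let skyboxName = "aerodynamics_workshop_map"
        let skyboxResource = try! EnvironmentResource.load(named: skyboxName)
        arView.environment.lighting.resource = skyboxResource
        arView.environment.background = .skybox(skyboxResource)

        // A fully metallic, perfectly smooth (mirror-like) material.
        var sphereMaterial = SimpleMaterial()
        sphereMaterial.metallic = MaterialScalarParameter(floatLiteral: 1)
        sphereMaterial.roughness = MaterialScalarParameter(floatLiteral: 0)

        // A sphere primitive anchored at the world origin.
        let sphereEntity = ModelEntity(mesh: .generateSphere(radius: 1),
                                       materials: [sphereMaterial])
        let sphereAnchor = AnchorEntity(world: .zero)
        sphereAnchor.addChild(sphereEntity)
        arView.scene.addAnchor(sphereAnchor)

        // A perspective camera, also anchored at the world origin.
        let cameraEntity = PerspectiveCamera()
        cameraEntity.camera.fieldOfViewInDegrees = 60
        let cameraAnchor = AnchorEntity(world: .zero)
        cameraAnchor.addChild(cameraEntity)
        arView.scene.addAnchor(cameraAnchor)

        // Orbit the camera around the sphere, updating its position every frame.
        let cameraDistance: Float = 3
        var currentCameraRotation: Float = 0
        let cameraRotationSpeed: Float = 0.01
        sceneEventsUpdateSubscription = arView.scene.subscribe(to: SceneEvents.Update.self) { _ in
            let x = sin(currentCameraRotation) * cameraDistance
            let z = cos(currentCameraRotation) * cameraDistance
            // look(at:from:relativeTo:) positions the camera and points it at the origin.
            cameraEntity.look(at: .zero, from: SIMD3<Float>(x, 0, z), relativeTo: nil)
            currentCameraRotation += cameraRotationSpeed
        }
    }
}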

Wrapping Up

That ends this quick tutorial, where we’ve set up something really basic. RealityKit is certainly still lacking a lot of things and can’t be considered mature enough to replace SceneKit just yet. It does, however, offer face, object and people occlusion out of the box, as well as physics and better performance. It certainly looks promising.

I mentioned before that documentation is minimal and information is scattered around the internet, so you’ll have to do some digging to find whatever you’re after.

Hopefully this quick tutorial helped you save some time.
