Streaming is available in most browsers and in the Developer app.
-
Port SceneKit projects to RealityKit
Learn why SceneKit is being deprecated and explore how to migrate your 3D projects to RealityKit, Apple's recommended high-level 3D engine. We'll clarify what the SceneKit deprecation means for your projects, compare key concepts between the two engines, and show you how to port a sample SceneKit game to RealityKit. We'll also explore the potential RealityKit offers across all supported platforms to help you build great 3D experiences for your apps and games.
Chapters
- 0:00 - Start
- 5:03 - Core differences
- 8:59 - Asset conversion
- 11:57 - Scene composition
- 15:21 - Animations
- 16:58 - Lights
- 18:37 - Audio
- 21:19 - Visual effects
-
Hi, welcome to the session. I’m Max Cobb, a software engineer here at Apple. I want to start the session off by taking a short look into the past. If you’re a SceneKit developer, you might recognize a super fun and inspiring sample game that Apple shared many years ago. In this project, a red panda, whose name is also Max, runs around a volcanic scene, solving puzzles to save his friends while fighting off enemies. This sample was built using a framework called SceneKit, which allowed developers to create native 3D apps without needing to inflate their bundles with an entire third-party game engine. SceneKit has been around for many years. Actually since OS X Mountain Lion. That’s 13 years ago. A lot has changed in the Apple developer ecosystem since then. New coding paradigms, new hardware, and new platforms. In other words, there's been a big shift in the way that people build and interact with apps. SceneKit was designed with an architecture that made a lot of sense at the time. But as the ecosystem moved on, that made it very challenging to keep SceneKit up to date without introducing breaking changes to existing 3D applications. That's why this year Apple is officially deprecating SceneKit across all platforms.
But what does that mean to existing projects? Is my app still going to work? Do I need to rewrite it? What can I use instead? Let’s break down what this deprecation means to SceneKit developers. First, let me clarify, there’s no need for you to rewrite your apps. This is a soft deprecation, meaning that existing applications that use SceneKit will continue to work. However, if you're planning a new app or a significant update, SceneKit is not recommended.
Secondly, SceneKit is now entering maintenance mode. Apple will only fix critical bugs, so don't expect new features or optimizations moving forward.
At this time, there’s no plan to hard deprecate SceneKit, and Apple will give developers ample notice if this ever were to change.
But if you want to use Apple’s cutting-edge technology and industry-leading combination of hardware and software, the best option is RealityKit.
RealityKit is a modern, general-purpose, high-level 3D engine. It’s both powerful and approachable, giving access to industry standards, empowering you to create beautiful, realistic renders with many advanced features that can make your app really shine. It’s the technology powering many third-party apps and system features, like Quick Look on the Mac, for previewing 3D models with the tap of a button. The brand new App Store Tags on iOS 26 are one of the big changes that make your apps even more discoverable. App Store Tags have stylized 3D icons that are rendered with RealityKit. Here’s an example where the App Store surfaced a list of games that were tagged for having great 3D graphics. Also new in iOS, Swift Charts uses RealityKit to deliver a third dimension to your data visualizations.
And RealityKit is not just used in visionOS, but it’s the backbone of that platform. Everything on visionOS leverages RealityKit, including the buttons in your applications and the windows that hold them. RealityKit makes your app content feel like it’s really there alongside your real-world environment.
RealityKit also puts SwiftUI in the front seat, so SwiftUI developers can feel right at home. It’s supported on visionOS, iOS, macOS, and iPadOS, and new this year, RealityKit is making its way to a new platform: tvOS, bringing another destination for apps and other content built with RealityKit. This framework is packed with advanced and exciting new features this year too. To learn more, please check out the session “What’s New in RealityKit” from my colleague Lawrence Wong. In today’s session, I will help you understand how RealityKit works when compared to SceneKit, what’s different, the new possibilities, and how you can get started coming from the SceneKit world. I also want to make my game more modern, ready for exciting new features and the platforms I’m thinking about. So during the session, I’ll be explaining the main steps I took porting this fun SceneKit game over to RealityKit. The full sample code for this project is available for you to download and explore. So here’s the agenda. I’ll start by explaining the conceptual differences between these two rendering engines and how to interact with them. Next, there’s no 3D game without some cool assets. I’ll explore ways to convert existing SceneKit assets into the format of choice for RealityKit. I’ll show you the tools you can use to compose RealityKit scenes, and start comparing the features in SceneKit and RealityKit side by side, starting with animations. Then I’ll give my scene a stylish look by adding dynamic lights, add immersion and personality with custom audio, and bring it home with visual effects like particles and post-processing. That’s everything I need to bring a game like mine from SceneKit to RealityKit. Alright, let’s dive in. In terms of concepts, I’ll focus on four key areas: architecture, coordinate systems, assets, and views.
Starting with architecture, SceneKit is node-based. That means that every object in the scene is a node, and these nodes have predefined behaviors in the form of properties.
Each node has properties for features, including geometry, animations, audio, physics, and lights.
For example, when I create an empty node, it has no geometry, no special properties, and is positioned at its parent's origin.
When I want to render Max in my game, I can assign the model to a node’s geometry property.
When Max walks around the scene, the app is assigning an animation player to the node and playing it.
The footsteps that you hear are coming from an audio player, also assigned to the same node.
So that’s how a node-based architecture works. Everything revolves around the node and its properties.
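Sketched in code, that pattern looks roughly like this (the geometry and audio file here are stand-ins, not the sample's actual assets):

// SceneKit: one node, and every behavior hangs off its properties
import SceneKit

let maxNode = SCNNode()                              // empty node at its parent's origin
maxNode.geometry = SCNSphere(radius: 0.2)            // stand-in geometry; the game assigns Max's model here
if let footsteps = SCNAudioSource(named: "footsteps.mp3") {   // hypothetical audio file
    maxNode.addAudioPlayer(SCNAudioPlayer(source: footsteps)) // footstep audio on the same node
}
// Animations attach to the same node through an SCNAnimationPlayer, for example:
// maxNode.addAnimationPlayer(walkPlayer, forKey: "walk")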
In contrast, RealityKit uses a design pattern called Entity Component System, or ECS for short.
This means that every object in the scene is an Entity, and I modify its behavior by attaching components to it. Every behavior in RealityKit is a component.
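A minimal sketch of the same idea in RealityKit, using a stand-in mesh rather than the actual Max model:

// RealityKit: an entity is just an identity; components define its behavior
import RealityKit

let maxEntity = Entity()
maxEntity.components.set(ModelComponent(mesh: .generateSphere(radius: 0.2),
                                        materials: [SimpleMaterial()]))  // stand-in for Max's model
maxEntity.components.set(Transform(translation: [0, 1, 0]))              // even position is a component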
Components cover everything from an entity’s transform to advanced behaviors like physics, particles, and even portals, and the list of components keeps growing as RealityKit evolves. New components this year include image presentation and gesture components. These are the architectural differences to keep in mind when bringing SceneKit apps over to RealityKit. Next is an easy one: when coming to a new rendering engine, you can’t do much without understanding the coordinate system.
Coordinate systems are easy to translate between SceneKit and RealityKit because they are the same. In both engines, the x-axis points to the right, the y-axis points up, and the z-axis points directly toward the camera.
For assets, SceneKit is flexible in the model formats it accepts, and the engine serializes them into SCN files. This is convenient but it’s not a unified standard across the industry. Most of these formats are proprietary with varying feature support, bringing extra complexity to asset pipelines.
RealityKit on the other hand is designed around an open industry standard called Universal Scene Description or USD.
It’s a format introduced by Pixar in 2012 with the goal of solving a few difficulties in the asset creation pipeline, including data exchange and collaboration. This is the standard of choice for Apple across all platforms. I’ll need to convert some SCN files to USD for my game, so I’ll dig into those details in just a moment. Before that, the last core difference I want to highlight is views. Views are fundamental building blocks that represent a portion of an app’s user interface. In the case of both SceneKit and RealityKit, it’s a viewport that renders 3D content.
With SceneKit, I can render content through an SCNView, or a SceneView if using SwiftUI. There’s also an ARSCNView, which lets me render virtual objects in the real world with ARKit. With RealityKit the path is simple: content renders through a RealityView, which was designed from the ground up for all the conveniences we’re so used to with SwiftUI. I can render entirely virtual scenes or place objects in the real world with just this one view.
The same content deploys and adapts across all supported Apple platforms, and even performs stereoscopic rendering in visionOS automatically, without any changes to the code. Great, so those are the main core concepts you should have in mind when transitioning to RealityKit: architecture, coordinate systems, asset support, and views. Next, every great game has to have some nice assets. So let’s take a look at what I currently have in my game. Inside the art asset catalog, I have a collection of 3D models. These models are in the SCN file format. This is great for a SceneKit project, but I need to convert all these assets to USD so I can use them in RealityKit. Let me show you some options. If you have the models in their original format, that would be the best choice. Chances are your Digital Content Creation Tool, or DCC, offers good support for USD. I’m using Blender, so I can export the asset to this format straight away.
But if you don’t have the original files, there are a few other options for converting your existing SCN asset directly to USD. One method, and probably the easiest, is right in Xcode. To do so, select an asset. I’ll choose enemy1 this time. Then go to File, Export..., and in the Export options, choose a Universal Scene Description Package, which is a zipped USD file type.
The results of exporting this way may vary from asset to asset, but for most models that I have in my game, this works great. You might also have animations in separate SCN files, which is a common pattern in SceneKit. For instance, in my SceneKit game, animations for the main character like walking, jumping, and spinning are each in separate files with no geometry. But how do I export the animations to USD and apply them back to the geometry? Well, lucky for me, Apple updated a CLI that ships with Xcode 26 called SCN tool to help with this process. Let me show you how to use it. I can invoke SCN tool by typing xcrun scntool.
This displays a list of options available. To convert, I can type xcrun scntool --convert specifying the file max.scn in this case. And --format usdz as the output type.
This alone would convert the SCN file to USD in the same way as I did earlier in Xcode. To append the animation, I use --append-animation for each SCN animation file I want to export, max_spin in this case.
And save to desktop.
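Put together, the conversion looks roughly like this (the output destination is passed as a separate option; check xcrun scntool's help output for the exact flag):

xcrun scntool --convert max.scn --format usdz --append-animation max_spin.scn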
Let’s take a look at the converted file.
Great, my geometry has the animation information in USD format. I did this for all my assets and organized them in a way that works great for my preferred workflow. Now I’m ready to start piecing the game together. Which brings me to the next topic, scene composition. With the SceneKit version, the SceneKit editor in Xcode helped to put all the pieces together. RealityKit has a tool to help me do this too, and it’s called Reality Composer Pro. Reality Composer Pro sits between Xcode and my DCC of choice, such as Blender or Maya. I can use it to compose my scene, add components to entities, create and modify shaders, and prepare the scene for Xcode. I can bring all my newly created USD assets in and begin putting the game back together. Reality Composer Pro ships with Xcode. I’ll open it now.
I’ll create my project with the name PyroPanda.
Reality Composer Pro gives me a default scene without any content.
Next, I can drag all those newly converted assets into my project.
To add these assets to my scene, I can either right click and choose Add to Scene.
Or I can drag in an asset such as Max from the project browser into the viewport directly.
Once in, repositioning entities is straightforward. I can use this gizmo to put Max at the game’s starting point. More or less right there. Reality Composer Pro is a great tool to compose my scene in a visual way, allowing me to edit materials, shaders, particles, lights, and more. Remember I said Reality Composer Pro sat in between my digital content creation tool and Xcode? Well, now I need to bring the content into my app. So that's the next task. The Reality Composer Pro project is a Swift package. I can add it as a local dependency in Xcode by going to my project’s package dependencies here, clicking on Add Local..., and choosing my app as the target.
Next, in my content view Swift file, I need to import RealityKit and my new package, PyroPanda, at the top here.
Within my ContentView, I’ll add a RealityView.
Then I need to load the scene as an entity, specifying that it comes from that package’s bundle.
And finally, add the new entity to my RealityView content.
I’ll also add a camera control just to show you the result.
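Assembled, the view looks roughly like this; the scene name "Scene" and the generated bundle constant pyroPandaBundle are assumptions that depend on how the Reality Composer Pro package is set up:

import SwiftUI
import RealityKit
import PyroPanda

struct ContentView: View {
    var body: some View {
        RealityView { content in
            // Load the composed scene from the Reality Composer Pro package's bundle.
            if let scene = try? await Entity(named: "Scene", in: pyroPandaBundle) {
                content.add(scene)
            }
        }
        .realityViewCameraControls(.orbit)   // simple camera control to inspect the result
    }
}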
I spent some time earlier building the scene up with Reality Composer Pro. Here's the fully composed result. I added the remaining models, assigned the textures, and created the dynamic shaders for the lava, plants, and one of the enemies, adding more personality to the volcanic environment.
You can check out the full sample code to inspect how each piece of this scene was built.
There are so many things you can do with Reality Composer Pro. To learn more, I’d recommend checking out one of these two sessions from previous WWDC years. This is starting to come together. Now I’ll make little Max come to life with some animation. When I converted Max earlier, I also appended an animation. When a USD file has animations, RealityKit exposes them in an AnimationLibraryComponent. This makes it easy to access and play the animations on my entities. I reference all the different animations from a single USD file called “Max”. In Reality Composer Pro, I can see the references to all the animations in the inspector here.
I can trigger each animation in my project by the name specified in this component.
In the SceneKit version of this game, this is how I played the spin animation. First, I found the Max_rootNode in my scene and loaded the spinAnimation separately. Then, I traversed through the animation scene until I found the SCN animation player and saved a reference to it. Then added the animationPlayer to Max, with the reference “spin”, and played the animation on the node. In RealityKit, accessing the animation via the AnimationLibraryComponent makes this really convenient. First, I find the Max entity by name, just “Max” in this case. From there, grab the AnimationLibraryComponent from Max’s component set, and select the desired animation by name. And finally, play it back on the entity. As Max navigates around the scene, my completed app plays different animations that represent the movement. Check out the source code for the app to see how this all connects. Something that adds an element of realism and mood to any scene is lighting. And when well applied, the difference can be night and day. Lighting in my application can be completely achieved through Reality Composer Pro without any additional code. Let's see how that looks. I can add a light by tapping the insert icon here, at the bottom of my entity hierarchy, and selecting a directional light.
This is an empty entity with just a directional light component. For this light type, only the orientation changes how it illuminates other entities in the scene.
I’ll position it up here just for visual clarity, and rotate around the x-axis as such.
The light looks good, but it’s missing something. There's no shadows! In the component list, I can also add a directional light shadow component by checking this box.
From the starting point, I can now see how the terrain and Max are casting shadows onto the rest of the scene. Achieving the same through code is very similar. For SceneKit, I create an SCNLight, set the type to directional, and set castsShadow to true.
I then create a new node and assign the light to the node’s light property.
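In code, that SceneKit setup is a short sketch:

// SceneKit
let light = SCNLight()
light.type = .directional
light.castsShadow = true

let lightNode = SCNNode()
lightNode.light = light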
For RealityKit, I create an entity with two components: a directional light component and a directional light shadow component. A directional light is one of the light options available in RealityKit. Others include point lights, spotlights, and image-based lights, which use textures to illuminate your scene.
My game is looking a little more visually dynamic now, so next I’ll add some audible elements to increase engagement a little more. Let’s take a look at the ambient audio track that's constantly looping in the scene. In my SceneKit project, I first load the ambient audio file as an audio source. I can then modify properties of that audio source to change how it plays. In this case, I want it to loop, and I don’t want it to be positional in the scene or spatial, meaning that the audio playback volume does not change based on the main camera’s distance to the source node. And finally, I add an audio player to the terrain node, starting the playback.
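A rough sketch of that SceneKit setup, with an illustrative file name rather than the sample's actual asset:

// SceneKit
if let ambientSource = SCNAudioSource(named: "ambient.mp3") {   // hypothetical file name
    ambientSource.loops = true            // keep the track looping
    ambientSource.isPositional = false    // volume doesn't depend on camera distance
    ambientSource.load()
    terrainNode.addAudioPlayer(SCNAudioPlayer(source: ambientSource))   // terrainNode: the scene's terrain node
}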
I can access audio files in RealityKit in a very similar way to how I access animations: through components. The component this time is called AudioLibraryComponent. I can configure the audio playback completely from Reality Composer Pro, rather than doing everything at my app’s runtime. Let's see how that setup looks. I’ve already attached an AudioLibraryComponent to the terrain entity with a reference to the ambient audio file. Since I don’t want the playback to sound like it’s coming from a specific source, I can add an ambient audio component to the same entity. This audio resource is quite long, and RealityKit’s default behavior will be to preload the whole track to memory at runtime. Instead, I can change that behavior to stream the audio resource as it plays.
And when the track finishes, I want the playback to start again from the beginning, so I'll select the Loop checkbox.
Everything’s now wired up, and the audio is ready to be played in RealityKit. There are two ways I can do this. The first is through the AudioLibraryComponent.
I can start by fetching the AudioLibraryComponent from the terrain’s component set, reference the ambient audioResource by name, and play it back on the same terrain entity. RealityKit sees the settings I added with Reality Composer Pro, so it will automatically loop and stream the audio as an ambient track. Alternatively, I can use a little trick from a built-in entity action called PlayAudioAction.
With this approach, the PlayAudioAction looks at the target entity’s AudioLibraryComponent for me and finds the named audio file.
I convert that action into an animation and play it on the same terrain entity.
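Both approaches look roughly like this; the entity and resource names are illustrative:

// RealityKit, option 1: play directly from the AudioLibraryComponent
guard let terrain = scene.findEntity(named: "Terrain"),
      let audioLibrary = terrain.components[AudioLibraryComponent.self],
      let ambientResource = audioLibrary.resources["ambient.mp3"] else { return }
terrain.playAudio(ambientResource)

// RealityKit, option 2: let a PlayAudioAction resolve the resource by name
let action = PlayAudioAction(audioResourceName: "ambient.mp3")
if let actionAnimation = try? AnimationResource.makeActionAnimation(for: action, duration: 1) {
    terrain.playAnimation(actionAnimation)   // duration is a nominal value for the action's timeline
}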
Entity actions are really helpful for minimizing the actual code in my application.
I use this action and some others for various events in my game. For example, whenever the main character jumps or attacks.
For the final step I’ll cover in this session, let’s take a look at the visual effects included in my game. Visual effects can turn a 3D scene from something that’s merely accurate into a truly emotive experience. Starting with particles, I have some really nice particle effects that were put together for the original game from right inside the SceneKit editor in Xcode. These particles are saved as SCN particle files. There’s no tool to directly convert these files into something that’s compatible with RealityKit, but I can make particle effects with very similar settings through Reality Composer Pro. Let’s go to Reality Composer Pro and see what editing particles looks like. I prefer to keep my particles as separate USD files, so I can add a new file by clicking here in Reality Composer Pro. I'll name it volcano_smoke.
I can add a Particle Emitter component right to the root entity of this file.
From there, by pressing the Play button above the components, the default particles begin to appear.
There are a few presets I can choose from, including one of my favorites, Impact.
This particle preset has a good texture for smoke, so it’s a great starting point for this effect. These settings may be familiar when coming from SceneKit, with some small differences.
I compared the settings in Reality Composer Pro to the ones in my original game’s particles in SceneKit, and have come up with a RealityKit Volcano Smoke effect similar to the SceneKit version.
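The same component can also be configured in code if that fits your workflow better; here's a minimal sketch with made-up values rather than the sample's tuned settings:

// RealityKit
var smoke = ParticleEmitterComponent.Presets.impact   // start from the Impact preset
smoke.mainEmitter.birthRate = 40
smoke.mainEmitter.lifeSpan = 5
smoke.mainEmitter.size = 0.5
volcanoEntity.components.set(smoke)                   // volcanoEntity: wherever the smoke should emit from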
Now that it’s done, I’ll drag that into my main scene and see what it looks like with all the other models.
Great, that is exactly what I was aiming for. The final step is post-processing. This is where you can add the finishing touches with a final pass of your rendered output before it appears in your app.
In the original game, the camera had a strong bloom effect, making bright lights bleed through the rest of the scene, adding a soft, radiant glow that enhances the game’s otherworldly atmosphere.
This was achieved by modifying a few properties on the scene’s active camera. While this brings a lot of convenience, developers, and especially game developers, often prefer to have very tight control over effects like this, for performance as well as artistic preferences.
But what about RealityKit? A simple property to create this effect is deliberately not available in RealityKit, but starting this year, you can add post-processing effects to RealityViews on iOS, iPadOS, macOS, and tvOS.
This does require some setup, but it doesn’t mean you necessarily need to write all the Metal shaders from scratch yourself. Apple has some highly optimized Metal Performance Shaders you can use to get started. I’ll create a bloom effect using the post-processing API, which I can add to my game. Starting with the original texture from my game, I want to extract the highlights and write those to a separate texture. I then blur it to feather the edges, and finally, composite that blurred texture on top of the original to create the bloom effect.
To do this in my app, I first need to define a type, BloomPostProcess, which conforms to PostProcessEffect.
Within the postProcess method, I create a temporary Metal texture to write some of the bloom data to. Next, I can use a performance shader to extract only the brightest parts of my image, which is where I want the blooming to come from. I blur that area using a Gaussian blur. And finally, I place that blurred image on top of the original texture. The full code for these steps is available for you in the download.
To apply this effect to my RealityView, I can create an instance of my new BloomPostProcess class and apply it to the rendering effects.
This adds a really nice final touch to my game. The environment becomes more vibrant and is really an amazing experience to play. My app has really come together now with RealityKit. It runs exactly the same on iOS, iPadOS, macOS, and tvOS. With one core code base and a RealityKit scene, I can launch my app across all these platforms right away. Let me show you this game running on RealityKit’s latest platform, tvOS, with controller support.
Now I can play my RealityKit game on Apple TV at home.
And on visionOS, there’s something even more special I can do. By placing my scene inside of a progressive immersion view, I can add a portal into the PyroPanda world that renders in full 3D right in front of me.
This kind of experience is only possible with RealityKit and visionOS.
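On visionOS, that portal experience hangs off an immersive space using the progressive immersion style; roughly like this, where the identifiers and the view hosting the RealityKit scene are hypothetical:

import SwiftUI

@main
struct PyroPandaApp: App {
    @State private var immersionStyle: ImmersionStyle = .progressive

    var body: some Scene {
        ImmersiveSpace(id: "PyroPandaWorld") {
            PyroPandaImmersiveView()   // hypothetical RealityView-based view hosting the game scene
        }
        .immersionStyle(selection: $immersionStyle, in: .progressive)
    }
}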
Let’s take a look at what’s been covered today. SceneKit is deprecated. And while that’s an important change, it doesn’t mean you need to worry about your existing SceneKit applications anytime soon. The best path forward is RealityKit, bringing with it unique possibilities for your apps and games.
I’ve discussed the core differences in terms of concepts and tooling available for RealityKit developers, and some of the major steps I took migrating a game from SceneKit over to RealityKit.
I’d encourage you to check out these sessions to help you on your journey to making amazing apps and games with RealityKit, along with other sessions from previous years, as well as the RealityKit documentation. In the download for this sample app, there are even more details about areas I didn’t have time to cover today, such as camera motion, driving character movement, and game controllers. Technology is always evolving. Apple wants the SceneKit deprecation to be as smooth as possible for developers like yourselves. We’re excited about the future of RealityKit and can’t wait to see what SceneKit developers do with it next. Thank you for watching, and enjoy the rest of the conference.
-
16:33 - Animations in RealityKit
// RealityKit
guard let max = scene.findEntity(named: "Max") else { return }

guard let library = max.components[AnimationLibraryComponent.self],
      let spinAnimation = library.animations["spin"] else { return }

max.playAnimation(spinAnimation)
-
18:18 - Directional Light Component in RealityKit
// RealityKit
let lightEntity = Entity(components:
    DirectionalLightComponent(),
    DirectionalLightComponent.Shadow()
)
-
24:37 - Create a bloom effect using the RealityKit post-processing API
final class BloomPostProcess: PostProcessEffect {

    let bloomThreshold: Float = 0.5
    let bloomBlurRadius: Float = 15.0

    func postProcess(context: borrowing PostProcessEffectContext<any MTLCommandBuffer>) {

        // Create metal texture of the same format as 'context.sourceColorTexture'.
        var bloomTexture = ...

        // Write brightest parts of 'context.sourceColorTexture' to 'bloomTexture'
        // using 'MPSImageThresholdToZero'.

        // Blur 'bloomTexture' in-place using 'MPSImageGaussianBlur'.

        // Combine original 'context.sourceColorTexture' and 'bloomTexture'
        // using 'MPSImageAdd', and write to 'context.targetColorTexture'.
    }
}

// RealityKit
content.renderingEffects.customPostProcessing = .effect(
    BloomPostProcess()
)
-