Apple this week released the visionOS SDK, a set of tools to help developers create apps for the Apple Vision Pro mixed reality headset.
Apple is preparing for the launch of its first spatial computer in the US, and ahead of that launch it is equipping app developers with the tools they need to build for the device. Unlike Apple's other products, the upcoming mixed reality headset relies on three modes of interaction: eyes, hands, and voice. Through the visionOS SDK, developers can customize and harness these capabilities and take advantage of the specialized hardware integrated into the device.
The SDK is built on the same foundational frameworks used across Apple's other operating systems, and it leverages familiar development tools including Xcode, SwiftUI, RealityKit, ARKit, and TestFlight.
Susan Prescott, Apple’s vice president of Worldwide Developer Relations, said that developers can build visionOS apps using the powerful frameworks they already know, and that Apple is taking development even further with innovative new tools and technologies such as Reality Composer Pro, enabling developers to design all-new experiences for their users.
The visionOS SDK is now available on the Apple Developer website. To create spatial computing apps for the Apple Vision Pro, developers can download Xcode 15 Beta 2, which includes the latest visionOS SDK along with Reality Composer Pro, a tool for visualizing and previewing 3D content for the headset.
With these tools, developers can use a visionOS simulator to interact with their apps during development. The simulator lets them test how their apps will look under various lighting conditions and in different room layouts. Using the SDK, developers can either adapt an existing app project for the headset or build a brand-new application from scratch.
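To give a sense of what building from scratch looks like, here is a minimal sketch of a visionOS app entry point in SwiftUI. This is an illustrative example, not code from Apple's documentation: the type names `SpatialHelloApp` and `ContentView` are placeholders, and it assumes Xcode 15 Beta 2 with the visionOS SDK installed.

```swift
import SwiftUI

// Minimal sketch of a visionOS app, assuming the visionOS SDK from
// Xcode 15 Beta 2. All type names here are illustrative placeholders.
@main
struct SpatialHelloApp: App {
    var body: some Scene {
        // A volumetric window style gives the app a bounded 3D volume
        // in the user's space rather than a flat 2D window.
        WindowGroup {
            ContentView()
        }
        .windowStyle(.volumetric)
    }
}

struct ContentView: View {
    var body: some View {
        Text("Hello, spatial computing")
            .font(.largeTitle)
            .padding()
    }
}
```

Because this is standard SwiftUI, an existing iPadOS or iOS app structured this way can often be adapted for the headset largely by adding the visionOS destination to the Xcode project and adjusting window styles.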
Apple is also expanding its developer tools with Reality Composer Pro, an Xcode feature for previewing 3D models, images, sounds, and animations on the headset. It can also preview a virtual experience without requiring the hardware or the simulator. Unity development tools will be incorporated as well, addressing the absence of gaming experiences in the original presentation.
The visionOS SDK also underscores the Vision Pro's focus on enterprise applications. Stephen Prideaux-Ghee, PTC's chief technology officer of AR/VR, described how manufacturers could use PTC's AR solutions to collaborate on critical business problems, bringing interactive 3D content into the real world with Apple Vision Pro. That kind of collaboration would let stakeholders in different departments and locations review content simultaneously, aiding design and operational decisions.