Unity has released the beta version of PolySpatial, its development platform for the Apple Vision Pro, which lets developers build 3D experiences for visionOS.
PolySpatial is designed to help developers port existing content and create new 3D experiences for Apple's upcoming mixed reality headset, making it easier to build interactive visionOS applications.
The Apple Vision Pro offers a range of capabilities, such as face tracking, object detection, and scene understanding. Developers can use these to create games that react to the player's environment, respond to facial expressions, or identify objects in the player's surroundings.
The new tool gives developers access to Apple's computer vision and machine learning APIs, allowing them to create more engaging AR/VR games.