A utility tool for visionOS developers to debug and calibrate the offset (position and rotation) of 3D objects attached to hand anchors in real-time.
📖 Read the Article: Check out the full Medium article explaining the implementation details and the specific `Info.plist` crash this project solves.
Getting the perfect alignment for held objects—like swords, tools, or sports equipment—is difficult when guessing coordinates in code. This project provides a runtime dashboard to adjust these values on the fly while wearing the headset.
- Real-Time Calibration: Adjust X, Y, Z position and rotation (Euler angles) using a floating SwiftUI window.
- Instant Feedback: The 3D object (e.g., a Golf Club) updates instantly in the Immersive Space as you move the sliders.
- Console Output: One-click button to print the final coordinate values to the Xcode console for easy copying into your production app.
- ARKit Hand Tracking: Uses `HandTrackingProvider` and `AnchorEntity(.hand)` to attach objects to the user's palm; a minimal sketch follows this list.
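The anchoring itself takes only a few lines of RealityKit. Below is a minimal sketch of the approach, with a placeholder box standing in for the golf club model; the view name and offset values are illustrative, not the project's exact code:

```swift
import SwiftUI
import RealityKit

struct HandAnchoredView: View {
    var body: some View {
        RealityView { content in
            // Anchor to the right palm; RealityKit keeps the transform
            // updated as the hand moves.
            let handAnchor = AnchorEntity(.hand(.right, location: .palm))

            // Placeholder geometry standing in for the real golf club model.
            let club = ModelEntity(
                mesh: .generateBox(size: [0.03, 0.03, 0.5]),
                materials: [SimpleMaterial(color: .gray, isMetallic: true)]
            )

            // The offset you calibrate at runtime (position + rotation).
            club.transform.translation = SIMD3<Float>(0, 0, 0.05)

            handAnchor.addChild(club)
            content.add(handAnchor)
        }
    }
}
```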
- Xcode 15.2+ (with visionOS SDK installed)
- Apple Vision Pro (Required for Hand Tracking) or visionOS Simulator.
If you fork this project or recreate it, be aware of a common crash on launch:
Thread 1: Fatal error: Your app was given a scene with scene session role UIWindowSceneSessionRoleVolumetricApplication but no scenes declared in your app body match this role.
The Fix:
This app uses a standard 2D window (`.windowStyle(.plain)`) for the controls, but Xcode defaults new projects to expect a Volumetric window.
We solved this in Info.plist by changing the Preferred Default Scene Session Role:
- From: `UIWindowSceneSessionRoleVolumetricApplication`
- To: `UIWindowSceneSessionRoleApplication`
See Info.plist in the source code for the correct configuration.
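In the raw plist XML, the relevant entry looks roughly like this (a sketch only; your `UIApplicationSceneManifest` dictionary will contain additional scene configuration):

```xml
<key>UIApplicationSceneManifest</key>
<dict>
    <key>UIApplicationPreferredDefaultSceneSessionRole</key>
    <string>UIWindowSceneSessionRoleApplication</string>
</dict>
```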
- `CalibrationView.swift`: The 2D control panel containing sliders for `GolfConfig`.
- `GolfImmersiveView.swift`: The immersive space that handles the ARKit session and updates the 3D entity transform.
- `GolfConfig.swift`: An `@Observable` data model that stores the offset values and converts them to `SIMD3` and `simd_quatf` (see the sketch below).
- `HandAnchorDebuggingApp.swift`: The entry point setting up the `WindowGroup` and `ImmersiveSpace`.
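As a rough sketch of what such a data model can look like (the property names here are assumptions, not the project's exact API):

```swift
import simd
import Observation

// A minimal @Observable offset model: sliders bind to the scalar
// properties, and the computed properties feed RealityKit transforms.
@Observable
class GolfConfig {
    // Position offset in meters.
    var x: Float = 0
    var y: Float = 0
    var z: Float = 0

    // Rotation offset as Euler angles, in radians.
    var pitch: Float = 0
    var yaw: Float = 0
    var roll: Float = 0

    // Position offset as a SIMD3 for the entity's translation.
    var position: SIMD3<Float> {
        SIMD3<Float>(x, y, z)
    }

    // Compose the Euler angles into a single quaternion (X, then Y, then Z).
    var rotation: simd_quatf {
        simd_quatf(angle: pitch, axis: [1, 0, 0])
            * simd_quatf(angle: yaw, axis: [0, 1, 0])
            * simd_quatf(angle: roll, axis: [0, 0, 1])
    }
}
```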
- Clone the repo: `git clone https://github.com/Lucalangella/HandAnchorDebugging.git`
- Open `HandAnchorDebugging.xcodeproj` in Xcode.
- Connect your Apple Vision Pro or launch the Simulator.
- Run the app.
- Click "Start Rig" to enter the Immersive Space.
- Look at your Right Hand to see the attached object.
- Use the window to adjust the alignment until it fits perfectly in your hand.
- Click "Print Final Values" to see the coordinates in the debug console.
This project is licensed under the MIT License. See the LICENSE file for details.
*Created by Lucalangella*