ARKit PlaneDetectionProvider: classification and alignment
We can filter ARKit PlaneAnchors based on classification or alignment values.
https://stepinto.vision/example-code/arkit-planedetectionprovider-classification-and-alignment/
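A minimal sketch of that filtering idea. The `Alignment`/`Classification` mirror enums and the `shouldKeep` helper are illustrative, not ARKit API; the guarded visionOS section assumes an already-created `ARKitSession` and uses `PlaneDetectionProvider`'s real `alignments:` initializer and `anchorUpdates` sequence:

```swift
// Hypothetical mirrors of the two plane filters we care about, so the
// selection logic can be expressed (and run) without a headset.
enum Alignment { case horizontal, vertical }
enum Classification { case wall, floor, ceiling, table, seat, window, door, unknown }

// Keep an anchor only if it matches the requested alignment and
// one of the accepted classifications.
func shouldKeep(alignment: Alignment,
                classification: Classification,
                wantedAlignment: Alignment,
                accepted: Set<Classification>) -> Bool {
    alignment == wantedAlignment && accepted.contains(classification)
}

#if canImport(ARKit) && os(visionOS)
import ARKit

// On visionOS the same idea uses PlaneDetectionProvider directly:
// request horizontal planes up front, then filter updates by classification.
func monitorTables(session: ARKitSession) async throws {
    let provider = PlaneDetectionProvider(alignments: [.horizontal])
    try await session.run([provider])
    for await update in provider.anchorUpdates {
        guard update.anchor.classification == .table else { continue }
        // Place or update content on the table plane here.
    }
}
#endif
```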

ARKit PlaneDetectionProvider: visualize detected planes
Peeling away the abstraction from "Placing content on detected planes" by Apple to see what we need to do to convert an anchor into something we can render and see.
https://stepinto.vision/example-code/arkit-planedetectionprovider-visualize-detected-planes/
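A sketch of one way to make a detected plane visible. The `quadCorners` helper is illustrative; the guarded visionOS section assumes a `PlaneAnchor` from `PlaneDetectionProvider` and builds a semi-transparent RealityKit quad sized to the anchor's extent:

```swift
// Pure helper: the four corner offsets of a plane quad, given its extent.
// Useful for building debug geometry by hand.
func quadCorners(width: Float, height: Float) -> [(Float, Float)] {
    let hw = width / 2
    let hh = height / 2
    return [(-hw, -hh), (hw, -hh), (hw, hh), (-hw, hh)]
}

#if canImport(ARKit) && canImport(RealityKit) && os(visionOS)
import ARKit
import RealityKit

// Turn a detected plane anchor into a flat, semi-transparent quad
// so the plane can actually be seen in the immersive space.
@MainActor
func makePlaneEntity(for anchor: PlaneAnchor) -> ModelEntity {
    let extent = anchor.geometry.extent
    // generatePlane builds a quad in the X/Z plane sized to the extent.
    let mesh = MeshResource.generatePlane(width: extent.width, depth: extent.height)
    let material = SimpleMaterial(color: .white.withAlphaComponent(0.4), isMetallic: false)
    let entity = ModelEntity(mesh: mesh, materials: [material])
    // Position the quad where ARKit says the plane is.
    entity.transform = Transform(matrix: anchor.originFromAnchorTransform)
    return entity
}
#endif
```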
I’ve been busy with client work this week, so while I don’t have anything *actually* new to show you, let’s all cast our minds back to this fun AR project that I made in ye olden times and pretend it’s new together, yes?
Step Into Newsletter - January 5, 2025
Working with hand anchors, transitioning between immersive spaces, and getting started with Spatial SwiftUI.
#AppleVisionPro #visionOS #RealityKit #ARKit #SwiftUI
https://stepinto.vision/articles/step-into-newsletter-january-5-2025/
Here is my progress on day 2. I have now also implemented #ARKit to show my mannequin in the real world.
Messing around with some Medieval AR prototypes tonight. Which neigh would you choose?
#AR #SpatialDesign #EP1320 #TeenageEngineering #MR #XR #ARKit #OrdinaryObjects
This Friday @fosiaDesign and I came up with an app idea we got really excited about. We spent the weekend playing around with RealityKit, CoreML, and ARKit and got a little tech demo working.
#visionpro #visionos #arkit Did you know that RealityKit files (.reality) open in Quick Look on visionOS 2 and can be manipulated just like USDZ files opened in Quick Look?
Try it out. Create framed photos with "FrameIt Vision" 2.0 ( https://itunes.apple.com/app/id6472856422), export them to Files and then open them from the Files app.
We didn't get RoomPlan yet, but this new provider is not far from fantastic
#wwdc24 #arkit
https://www.elkraneo.com/roomtrackingprovider/
In macOS 15 Sequoia and iOS/iPadOS 18, support for the 3D data formats USD and MaterialX has been enhanced in Finder, the Preview app, command-line tools, and more.
https://applech2.com/archives/20240620-usd-and-materialx-3d-files-in-macos-15-sequoia-and-ios-18.html
Apple releases macOS 14.6 beta
On Monday, Apple released beta 1 of macOS 14.6 for developers. This beta follows last week's release of the macOS 15 Sequoia beta during WWDC24. A public beta version of 1
https://www.apfeltalk.de/magazin/news/apple-veroeffentlicht-macos-14-6-beta/
#Mac #News #Apple #ARKit #BetaVersion #CoreSpotlight #Entwickler #Finder #Installation #MacOS146 #MacOSSequoia #SoftwareUpdates #VideoToolbox #WWDC24
Done with client work for this week! Now onto #ARKit and #RealityView
If anyone has some experience with either of these two and want to get on a call to bounce ideas, hit me up!
@twostraws Did you write that `handTracker.calculateDistance(from:to:)` method yourself or is that provided by ARKit? I can't find it…
I stumbled across these old screenshots from June 2017. The requisite hardware took a little longer than I expected. #visionOS #visionPro #ARKit
Lumalabs AI
https://lumalabs.ai
#ycombinator #Luma #Nerf #Camera #Capture #iPhone #Neural_Radiance_Fields #Neual_Rendering #Object #Scene #ARKit #iOS #Web #App #JavaScript #WebGL #Augmented_Reality #Virtual_Reality #Metaverse #Nerfs #TestFlight #mesh #meshes #USDZ #GLTF #blender #CGI #VFX #Photo #Video #Volume #Photogrammetry #Volumetric #Voxel #Pixel #Real #Virtual #Photography
So, is it still not possible to capture a high-quality image while using ARKit?
I'm trying to place an object where the user takes a photo in AR. I've modified the AR app template to make a custom AR view, and then moved the object creation code to a button action. I'm trying to create an ARAnchor using the camera transform and placing the object using that, but nothing’s appearing in the view (the template code places the object on a plane and that worked).
cf. line 129 in this gist: https://gist.github.com/JetForMe/adf1e04e4ec92c3dc7a29ce0910fb9a3?ts=4
Apple’s examples really want you to place objects on planes (https://developer.apple.com/documentation/arkit/arkit_in_ios/environmental_analysis/placing_objects_and_handling_3d_interaction), but that’s not what I want to do.
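One way to sketch the camera-transform placement described above. The `pointInFront` helper is illustrative; the guarded iOS section composes `frame.camera.transform` with a translation along local -Z (in front of the device) before adding an `ARAnchor`, which is the usual alternative to plane placement:

```swift
// Pure helper: a point `distance` meters along the camera's forward
// direction, given a position and a unit forward vector.
func pointInFront(position: [Float], forward: [Float], distance: Float) -> [Float] {
    zip(position, forward).map { $0 + $1 * distance }
}

#if canImport(ARKit) && os(iOS)
import ARKit

// Place an anchor a short distance in front of the current camera pose.
// Attach your geometry when the renderer asks for this anchor's node.
func placeAnchorInFrontOfCamera(in session: ARSession, distance: Float = 0.5) {
    guard let frame = session.currentFrame else { return }
    // Translate along -Z of the camera: "in front of" the device.
    var translation = matrix_identity_float4x4
    translation.columns.3.z = -distance
    let transform = simd_mul(frame.camera.transform, translation)
    session.add(anchor: ARAnchor(name: "placed", transform: transform))
}
#endif
```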
ARKit devs, what’s the portion of ARKit that would allow me to precisely localize a phone in a known space? Like to within a centimeter? I thought it might’ve been Location Anchors but I think that’s not it.