MultiSet v1.9.0 Launches: Developer-First Visual Positioning System (VPS) Creation, Unity Mapping & Mobile ModelSets, Quest Navigation - Plus Faster First-Lock and WebXR Orientation Fixes
- Shadnam Khan
- Sep 14
- 5 min read
MultiSet AI, the enterprise Visual Positioning System (VPS) for large-scale AR and XR, today announced v1.9.0, an update focused on three customer-driven pillars: VPS creation, ModelSet creation & tracking, and localization/UX performance. The release introduces a Unity Mapping scene to help teams build white-label iOS Pro mapping experiences, an upgraded MultiSet mobile app with .glb upload for one-tool ModelSet setup and tracking, and a series of quality-of-life improvements that shorten time-to-first-lock and standardize orientation handling across WebXR. The build is available now; see the live Changelog and docs for details.
With v1.9.0, we focused on first-minute success. We’re putting VPS creation in developers’ hands, streamlining object-level tracking into a one-tool workflow, and smoothing the user journey from first-lock to navigation.
Visual Positioning System creation: Unity Mapping puts scan workflows in developers’ hands
Developers can now build end-to-end mapping flows - from capture on LiDAR-equipped iPhone/iPad Pro devices to cloud processing to map activation - directly inside their own Unity projects and branded apps. The new Unity Mapping sample scene provides the scaffolding for white-label-ready mapping experiences, letting teams define custom UI/UX for scanning, uploads, and status handling while leveraging MultiSet's VPS to localize content precisely within those maps. The scene supports LiDAR-based iPhones and iPads, and the docs detail the pipeline from device capture to availability inside a developer's MultiSet account (see the Mapping sample scenes documentation).
MultiSet has showcased multi-level mapping of a 25,000 m², four-floor campus on an iPhone Pro, demonstrating how handheld LiDAR can stand up to complex, enterprise-grade environments. Teams can reference this video when planning facility rollouts or benchmarking scan throughput across routes, floors, and lighting conditions.
For developers getting started, the Unity SDK overview consolidates installation steps, sample-scene imports, and configuration of API credentials via the Developer Portal, ensuring fast time-to-POC and compatibility with existing AR Foundation projects.
Helpful links:
Unity Mapping scene (LiDAR iOS): https://docs.multiset.ai/unity-sdk/sample-scenes/mapping
Unity SDK quick start & Samples tab: https://docs.multiset.ai/getting-started/multiset-unity-sdk
Developer Portal for credentials: https://developer.multiset.ai/
“Unity Mapping is designed to be workflow-ready. You can stitch it into your own fleet apps, brand the experience, and ship a ‘scan → process → localize’ pipeline that fits the way your team already works.”
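To make the "scan → process → localize" pipeline concrete, here is a minimal sketch of the status handling a white-label mapping app would surface to users. The state names and the `MappingJob` shape are illustrative assumptions, not MultiSet API types; a real integration would drive these transitions from the SDK's upload and processing callbacks.

```typescript
// Hypothetical pipeline states for a branded mapping app. These names are
// assumptions for illustration, not the MultiSet SDK's actual status values.
type MapStatus = "capturing" | "uploading" | "processing" | "active" | "failed";

interface MappingJob {
  mapId: string;
  status: MapStatus;
}

// Valid forward transitions for the happy path; "active" and "failed" are terminal.
const NEXT: Record<MapStatus, MapStatus[]> = {
  capturing: ["uploading", "failed"],
  uploading: ["processing", "failed"],
  processing: ["active", "failed"],
  active: [],
  failed: [],
};

function advance(job: MappingJob, to: MapStatus): MappingJob {
  if (!NEXT[job.status].includes(to)) {
    throw new Error(`invalid transition ${job.status} -> ${to}`);
  }
  return { ...job, status: to };
}

// Walk a demo job through the full capture -> upload -> process -> active path.
let job: MappingJob = { mapId: "demo-map", status: "capturing" };
for (const step of ["uploading", "processing", "active"] as const) {
  job = advance(job, step);
}
console.log(job.status); // "active"
```

Modeling the pipeline as explicit states like this makes it straightforward to drive custom scanning UI (progress bars, retry prompts) from whatever status events the platform actually emits.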
ModelSet creation & tracking: a one-tool workflow from upload to localization
The updated MultiSet mobile app and platform workflow let teams create ModelSets for object anchoring directly from standard 3D files (e.g., .glb/.gltf), and then use those ModelSets to localize content relative to physical objects in the field.
That means developers can go from CAD or scanned models to trackable objects in a single, mobile-first path - minimizing tool switching and speeding up pilot-to-production timelines in manufacturing, logistics, and retail.
In this release, MultiSet also improved tracking stability on smaller objects, tightening performance for object-level anchoring in constrained spaces (e.g., compact fixtures, handheld tools, or shelf items). Combined with the ModelSet setup flow and the SDK’s tracking managers, these enhancements help maintain high-confidence locks in workflows where fine detail matters. (Developers can begin with the ModelSet “How-to” and then wire ModelSet codes into their Unity samples.)
MultiSet continues to support migration paths from common enterprise AR stacks - so if you’re moving from solutions like Model Targets, you can reuse existing .glb assets to seed ModelSets and accelerate time-to-value on the MultiSet platform.
Helpful links:
Create a ModelSet (.glb/.gltf workflow): https://docs.multiset.ai/basics/modelset-object-anchoring/how-to-create-a-modelset
ModelSet: Object Anchoring (overview & best practices): https://docs.multiset.ai/basics/modelset-object-anchoring
Migration guide (reuse .glb assets): https://docs.multiset.ai/workflow/migration-guide/vuforia-model-targets
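Since the ModelSet path starts from standard .glb files, a useful client-side step before any upload is validating the binary glTF container header. The sketch below checks the GLB magic, container version, and declared length as defined by the glTF 2.0 spec; it is generic format validation, not a MultiSet API call.

```typescript
// Validate the 12-byte GLB header: magic "glTF" (0x46546C67 little-endian),
// container version 2, and a declared total length no larger than the buffer.
// This is generic glTF 2.0 container validation, independent of any SDK.
function isValidGlbHeader(buf: ArrayBuffer): boolean {
  if (buf.byteLength < 12) return false;
  const view = new DataView(buf);
  const magic = view.getUint32(0, true); // little-endian per the GLB spec
  const version = view.getUint32(4, true);
  const length = view.getUint32(8, true);
  return magic === 0x46546c67 /* "glTF" */ && version === 2 && length <= buf.byteLength;
}

// Build a minimal header-only buffer for demonstration.
const demo = new ArrayBuffer(12);
const v = new DataView(demo);
v.setUint32(0, 0x46546c67, true);
v.setUint32(4, 2, true);
v.setUint32(8, 12, true);
console.log(isValidGlbHeader(demo)); // true
```

Catching a malformed or truncated .glb before upload saves a round trip and gives field users an immediate, actionable error instead of a failed processing job.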
Localization, tracking UX & accuracy: faster first-lock, smoother orientation, and ready-to-use navigation
Shorter time-to-first-lock. MultiSet v1.9.0 introduces a continuous localization-until-success behavior for the first attempt in a session, cutting down the “fiddle time” commonly associated with initial lock-on. For developers who need instant feedback loops, the Single Frame Localization samples (Unity and Quest) demonstrate the shortest path from camera frame to pose - ideal for POCs, kiosk flows, or “scan and go” utilities where speed matters more than long-horizon drift.
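The localize-until-success behavior can be pictured as a bounded retry loop over camera frames. In this sketch, `localizeFrame` is a hypothetical stand-in for an SDK localization call (it is not the MultiSet API); the point is the control flow, which keeps querying until a confident pose arrives or the attempt budget runs out.

```typescript
// A pose result from a single-frame localization query. The shape is an
// illustrative assumption, not the MultiSet SDK's actual response type.
interface Pose { position: [number, number, number]; confidence: number; }

// Keep submitting frames until the first confident lock, then hand off to
// continuous tracking. `localizeFrame` is a hypothetical stand-in.
function firstLock(
  localizeFrame: () => Pose | null,
  maxAttempts = 30,
  minConfidence = 0.5,
): Pose {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const pose = localizeFrame(); // one camera frame -> one localization query
    if (pose && pose.confidence >= minConfidence) return pose;
  }
  throw new Error(`no lock within ${maxAttempts} attempts`);
}

// Simulated localizer that succeeds on the third frame.
let frames = 0;
const fakeLocalizer = (): Pose | null =>
  ++frames < 3 ? null : { position: [0, 0, 0], confidence: 0.9 };

const pose = firstLock(fakeLocalizer);
console.log(`locked after ${frames} frames, confidence ${pose.confidence}`);
```

Bounding the loop keeps kiosk and "scan and go" flows responsive: the app either locks quickly or can prompt the user to reframe instead of spinning silently.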
Orientation handling in WebXR. For teams targeting the mobile web or headsets with browser-based AR, the WebXR SDK has been hardened to work reliably in both portrait and landscape contexts. The docs also cover camera intrinsics and image capture guidance to keep pose estimation stable across devices and browsers. Developers can use the open repo as a blueprint for integrating MultiSet’s VPS localization into modern web stacks (React, Three.js, Vite).
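Why orientation handling matters for pose stability: when a browser session flips between portrait and landscape, the camera frame is effectively rotated 90°, and the pinhole intrinsics must be rotated with it or pose estimation degrades. The sketch below shows the generic adjustment (using a pixel-corner convention where pixel (x, y) in a W×H frame maps to (H − y, x) after a 90° clockwise rotation); it is standard camera geometry, not a MultiSet-specific API.

```typescript
// Pinhole camera intrinsics for a W x H image.
interface Intrinsics {
  fx: number; fy: number;  // focal lengths in pixels
  cx: number; cy: number;  // principal point
  width: number; height: number;
}

// Intrinsics after rotating the image 90 degrees clockwise: the focal lengths
// swap and the principal point (cx, cy) maps to (H - cy, cx), using a
// pixel-corner convention. Generic geometry, independent of any SDK.
function rotate90cw(k: Intrinsics): Intrinsics {
  return {
    fx: k.fy,
    fy: k.fx,
    cx: k.height - k.cy,
    cy: k.cx,
    width: k.height,
    height: k.width,
  };
}

const landscape: Intrinsics = { fx: 1400, fy: 1395, cx: 960, cy: 540, width: 1920, height: 1080 };
const portrait = rotate90cw(landscape);
console.log(portrait.width, portrait.height); // 1080 1920
```

Applying this consistently on every orientation change keeps the projection model aligned with the captured pixels, which is the underlying requirement for stable portrait and landscape localization across devices and browsers.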
Navigation scene & Quest SDK. Teams can accelerate wayfinding and training with the Navigation sample in the Unity SDK: pre-configured POIs, UI wiring, and NavMesh guidance provide a drop-in starting point for AR navigation experiences that work indoors/outdoors. On Meta Quest, the Quest SDK ships with install and sample import guides; Single Frame Localization demonstrates fast lock-in on device, and the SDK is optimized for centimeter-accurate positioning via MultiSet’s VPS.
Developer Portal 3D viewer improvements. When developers open large scans in the portal, the initial launch orientation of the 3D viewer has been improved, making it faster to get your bearings before attaching content, testing localization paths, or sharing previews with collaborators. Sign in via developer.multiset.ai to access your maps, ModelSets, and credentials.
Helpful links:
Navigation sample (Unity): https://docs.multiset.ai/unity-sdk/sample-scenes/navigation
Quest SDK install & sample import: https://docs.multiset.ai/multiset-quest-sdk/installation-guide
Single Frame Localization (Unity & Quest): https://docs.multiset.ai/unity-sdk/sample-scenes/single-frame-localization and https://docs.multiset.ai/multiset-quest-sdk/sample-scenes/single-frame-localization
WebXR sample repo: https://github.com/MultiSet-AI/multiset-webxr-sdk
WebXR integration docs: https://docs.multiset.ai/basics/integrations
Why it matters - for builders and buyers
For XR developers and solution partners: v1.9.0 reduces the friction between scanning, localizing, and delivering - with workflow-ready scenes for mapping, navigation, and single-frame pose estimation. Unity developers can import samples from the Package Manager, wire in credentials, and put a branded mapping pipeline into the hands of field teams in days rather than weeks. Meanwhile, the WebXR repo keeps browser-based AR in play for pilots where installs are a barrier.
For innovation leaders and operations teams: The one-tool ModelSet path (mobile upload → object tracking) standardizes how your org turns CAD/scan assets into operational AR workflows. Whether you’re doing guided inspections, line-changeovers, guided pick/pack, or in-store planogram checks, ModelSet-based anchoring helps bring repeatability and version control to object-level content.
For IT and platform owners: MultiSet remains scan-agnostic and cross-platform, and supports enterprise deployment options including private cloud and on-prem, so you can keep sensitive spatial data in your controlled environment while rolling out at scale across Unity, iOS/Android, WebXR, and Meta Quest. See the product overview and SDKs page for the broader platform picture.
Availability
MultiSet v1.9.0 is live today. Teams can review the Changelog, import sample scenes in the Unity SDK and Quest SDK, try the WebXR sample, and sign into the Developer Portal to manage credentials, maps, and ModelSets.
Changelog: https://docs.multiset.ai/getting-started/changelog
Unity Mapping scene: https://docs.multiset.ai/unity-sdk/sample-scenes/mapping
Navigation scene: https://docs.multiset.ai/unity-sdk/sample-scenes/navigation
WebXR sample SDK: https://github.com/MultiSet-AI/multiset-webxr-sdk
Create a ModelSet (.glb/.gltf): https://docs.multiset.ai/basics/modelset-object-anchoring/how-to-create-a-modelset
Developer Portal: https://developer.multiset.ai/
MultiSet SDKs overview: https://www.multiset.ai/arsdks
About MultiSet AI
MultiSet AI equips developers with everything needed to build large-scale, location-aware applications: 3D mapping tools, a state-of-the-art Visual Positioning System (VPS) SDK, and a unified developer platform. The system is scan-agnostic, cross-platform (Unity, iOS, Android, WebXR, Meta Quest), and offers enterprise-grade security with deployment options spanning public cloud, private cloud, and on-prem. MultiSet powers use cases across navigation, training & inspections, and object tracking, with a focus on reliability in dynamic, real-world environments. Learn more at https://www.multiset.ai/.