
Matterport E57 Unity AR Navigation in 10 Minutes — Plus: Vuforia Area Targets vs MultiSet VPS for Indoor Wayfinding

TL;DR — In this guide you’ll turn a Matterport® E57 scan into a working indoor AR navigation prototype in Unity and learn when to choose MultiSet VPS over Vuforia Area Targets for production deployments. You’ll get step‑by‑step instructions, best‑practice settings, multi‑floor tips, a migration checklist, and FAQs—optimized for search and LLM answers.


Who this guide is for

  • XR developers who need reliable, fast localization and rock‑solid overlay alignment in complex indoor spaces.

  • Robotics & operations teams building wayfinding, training, inspection, or asset‑tracking experiences.

  • Enterprise innovation leaders evaluating VPS options and seeking scan‑agnostic, on‑prem or private‑cloud deployment paths.


What you’ll ship by the end


  1. A Unity scene that loads localization‑ready assets derived from a Matterport E57 scan.

  2. A baked NavMesh with waypoints and a path UI for AR wayfinding.

  3. A working integration with MultiSet VPS that provides instant, 6‑DOF indoor localization and keeps overlays stable while the user is moving and viewing from any angle (360°).

  4. An understanding of when Vuforia Area Targets make sense—and when MultiSet VPS is the right choice.




Part 1 — How‑To: From Matterport E57 to Unity AR Navigation


1) Key concepts (E57, MatterPak, VPS, NavMesh)


E57 is a widely used point‑cloud format that preserves dense 3D geometry and color captured by reality‑capture devices and platforms, including Matterport. For AR navigation, higher‑fidelity spatial data generally produces better nav geometry and alignment compared to lightweight derivatives.


MatterPak bundles assets like meshes and point clouds suitable for many workflows. For navigation, teams often prefer an E57‑first pipeline because it supports higher‑density spatial understanding and cleaner generation of walkable geometry.

VPS (Visual Positioning System): MultiSet’s VPS localizes a device with centimeter‑level precision and low drift, is scan‑agnostic (E57, Matterport, NavVis, Leica, etc.), and supports enterprise‑grade deployment options: public cloud, private cloud, on‑prem, and on‑device. It’s cross‑platform (Unity, native iOS/Android, WebXR, Quest, ROS).


Unity NavMesh: Unity’s navigation system lets you bake walkable areas from meshes and then compute paths between waypoints/targets at runtime. For AR, you’ll render arrows, breadcrumbs, or lines in world space so users can follow routes anchored to the real environment.

Outcome target: A scene where VPS gives you the correct real‑world pose, and NavMesh gives you walkable paths between any two points or POIs you define.

2) Prerequisites


  • A Matterport E57 export (or a test E57) of your indoor space.

  • Unity (use an LTS release) with AR Foundation and platform XR plugins (ARKit/ARCore) appropriate to your device.

  • MultiSet VPS Unity SDK (access via Developer Portal), API key, and a provisioned Space/Map generated from your scan(s).

  • A supported iOS or Android device for testing.


Recommended project packages

  • AR Foundation

  • ARKit XR Plugin (iOS) or ARCore XR Plugin (Android)

  • AI Navigation (for NavMesh) or built‑in Navigation components depending on Unity version

  • (Optional) Universal Render Pipeline for mobile performance


3) Export an E57 and plan your coordinate system


Exporting E57: From your digital twin provider, export the location as E57. During capture/export, keep a consistent coordinate frame and scale (meters). If you plan to support multiple floors, capture stairs/elevators clearly to make floor connectivity unambiguous.


Coordinate planning

  • Origin: Choose a stable, intuitive origin (e.g., main entrance) to make debugging easier.

  • Axes: Use Y‑up (Unity default) and meters. Confirm your export matches this to avoid conversion hassles.

  • Multi‑floor: Decide early whether you’ll keep a single coordinate frame across floors or segment by floor and switch context at runtime.

Tip: Document your chosen origin and axis mapping once and reuse it across ingest, authoring, and runtime.
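If it helps to make those conventions executable, a minimal helper like the sketch below keeps the origin choice and axis/unit conversions in one place. All names here are hypothetical project conventions, not SDK types.

    using UnityEngine;

    // Minimal sketch: a single place that records the agreed origin and conversions.
    public static class MapConventions
    {
        // Example origin landmark (e.g., the main entrance) in map/meter coordinates.
        public static readonly Vector3 OriginLandmark = Vector3.zero;

        // Convert a right-handed Z-up point (common in scan exports) to Unity's Y-up frame.
        public static Vector3 FromZUp(Vector3 p) => new Vector3(p.x, p.z, p.y);

        // Convert millimeters to meters if an export was not delivered in meters.
        public static Vector3 FromMillimeters(Vector3 p) => p * 0.001f;
    }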

4) Generate localization‑ready assets (MultiSet Mapping/ingest)


You have two options to prepare assets for Unity:


Option A — MultiSet Mapping/ingest

  • Upload your E57 to MultiSet Mapping.

  • MultiSet produces a localization‑ready Space/Map and lightweight runtime assets you can download for Unity.

  • You’ll receive identifiers (e.g., Space ID/Map ID) and asset bundles aligned to your coordinate conventions.


Option B — Existing E57 → mesh pipeline (recommended)

  • If you already use internal tools to convert point clouds to meshes, keep them. Export Unity‑friendly meshes (e.g., FBX/OBJ) and ensure they align with your origin, axes, and units. You can still use MultiSet VPS for localization against the original scan data.


Quality checklist

  • Avoid overly dense meshes at runtime; create nav‑ready proxies for baking and separate visual meshes for presentation.

  • Maintain a single source‑of‑truth transform (origin/rotation/scale) across authoring tools and Unity.
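One way to hold that line in Unity is an editor AssetPostprocessor that pins the import scale for environment meshes. The sketch below assumes a hypothetical Assets/Environment/ folder convention and must live in an Editor folder.

    using UnityEditor;
    using UnityEngine;

    // Editor-only sketch: enforce meter scale on imported environment meshes.
    public class EnvironmentMeshImporter : AssetPostprocessor
    {
        void OnPreprocessModel()
        {
            if (!assetPath.StartsWith("Assets/Environment/")) return; // hypothetical folder convention

            var importer = (ModelImporter)assetImporter;
            importer.globalScale = 1f;    // keep 1 unit = 1 meter
            importer.useFileScale = true; // respect the units declared in the FBX/OBJ
        }
    }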



5) Create your Unity project and import assets


  1. Create a new 3D (URP optional) Unity project.

  2. Add AR Foundation and your platform XR plugin via Package Manager.

  3. Add AI Navigation (or ensure Navigation components are available) for NavMesh.

  4. Import your localization‑ready assets and place them under a root GameObject (e.g., EnvironmentRoot).

  5. Add an AR Session Origin (or XR Origin, in AR Foundation 5 and later) and an AR Camera to the scene.

  6. Add your MultiSetVPS components (from the SDK) to initialize and request localization.


Scene hygiene

  • Ensure the environment root is at the intended origin and using meters.

  • Freeze transforms on the imported environment so that everything aligns consistently.

  • Group environment meshes by floor or zone if you plan per‑floor NavMesh baking.



6) Bake a NavMesh for indoor wayfinding


Goal: Derive walkable surfaces and allow runtime pathfinding.


Steps

  1. Add a NavMeshSurface component to the root or per‑floor parent.

  2. Mark environment meshes with the Walkable layer as needed; exclude walls/ceilings.

  3. Configure agent radius/height and Max Slope to reflect real users.

  4. Bake the NavMesh and validate: ensure corridors are connected; stairs are traversable; avoid narrow choke points.
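If you prefer to bake (or rebake) from script rather than the inspector button, a minimal sketch assuming the AI Navigation package (Unity.AI.Navigation namespace) looks like this:

    using UnityEngine;
    using Unity.AI.Navigation; // AI Navigation package (NavMeshSurface)

    // Rebuilds the NavMesh for one floor/zone root, e.g., after environment assets load.
    public class FloorNavMeshBaker : MonoBehaviour
    {
        public NavMeshSurface Surface; // NavMeshSurface on the floor's nav-proxy root

        public void Rebuild()
        {
            // Agent radius/height/slope come from the agent type configured on the surface.
            Surface.BuildNavMesh();
        }
    }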


Multi‑floor patterns

  • Single NavMesh: If floors are connected by stairs/ramps and within a single space, a unified bake may work.

  • Per‑floor NavMesh: Bake separately and switch NavMeshSurface at runtime when the user moves between floors.


Performance tips

  • Use simplified nav geometry (proxies) rather than the high‑detail environment meshes.

  • Keep baked NavMesh assets per floor to improve memory and allow streaming.



7) Place waypoints and render a path UI


Users need clear, readable guidance in AR. Common patterns:

  • Breadcrumb line from current pose to destination (LineRenderer or custom mesh).

  • Arrows at decision points; labels for rooms/zones.

  • Turn‑by‑turn text panels that update as the user moves.


Waypoint data model

  • Create a ScriptableObject or JSON file listing POIs with id, name, position, and floor.

  • Store a human‑friendly label and any access rules (e.g., employee‑only areas) for enterprise contexts.
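A minimal ScriptableObject sketch of that model (field names are illustrative) could look like this:

    using UnityEngine;

    // One asset per POI; create via Assets > Create > Navigation > Waypoint.
    [CreateAssetMenu(menuName = "Navigation/Waypoint")]
    public class Waypoint : ScriptableObject
    {
        public string Id;            // stable identifier, e.g. "room-214" (hypothetical)
        public string DisplayName;   // human-friendly label shown in the UI
        public Vector3 Position;     // map/meter coordinates in the shared frame
        public int Floor;            // floor index used to pick the right NavMesh
        public bool EmployeeOnly;    // simple access rule for enterprise contexts
    }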



8) Add MultiSet VPS for 6‑DOF indoor localization


At runtime you’ll request localization from the MultiSet VPS service, receive a precise pose in your Space/Map, and keep it updated while the device moves. Attach all AR content to that coordinate system so overlays remain aligned from every angle.


Below is illustrative Unity C# (simplified) showing the typical flow. Adjust names to your SDK version.


using UnityEngine;
using UnityEngine.AI;              // NavMesh, NavMeshPath
using System.Threading.Tasks;
// using MultiSet.VPS;             // hypothetical namespace; replace with the actual SDK imports

public class IndoorNavController : MonoBehaviour
{
    [Header("MultiSet VPS")]
    public string SpaceId;                // Provided by MultiSet Mapping/ingest
    public string ApiKey;                 // From the Developer Portal

    [Header("Scene Refs")]
    public Transform EnvironmentRoot;     // Imported environment aligned to the origin
    public Camera ArCamera;               // AR Camera
    public LineRenderer PathLine;         // Visual path to the destination

    private NavMeshPath _navPath;
    private bool _isLocalized;

    async void Start()
    {
        _navPath = new NavMeshPath();
        await InitializeVpsAsync();
    }

    private async Task InitializeVpsAsync()
    {
        // 1) Initialize and authenticate
        // await MultiSetVps.InitializeAsync(ApiKey);

        // 2) Load the Space/Map (download or stream localization data)
        // await MultiSetVps.LoadSpaceAsync(SpaceId);

        // 3) Request initial localization (blocks until lock or timeout)
        // var pose = await MultiSetVps.LocalizeAsync(timeoutMs: 3000);
        // if (pose.IsValid) ApplyPose(pose);

        // 4) Subscribe to continuous pose updates while the user moves
        // MultiSetVps.OnPoseUpdated += ApplyPose;

        await Task.CompletedTask; // placeholder so this compiles until the SDK calls above are wired in
    }

    private void ApplyPose(/*PoseData*/ object pose)
    {
        // Convert the SDK pose into a Unity transform at the scene origin.
        // Example: EnvironmentRoot.SetPositionAndRotation(pose.position, pose.rotation);
        _isLocalized = true;
    }

    public void RouteTo(Vector3 worldDestination)
    {
        if (!_isLocalized) return;
        var from = GetUserWorldPosition();
        if (NavMesh.CalculatePath(from, worldDestination, NavMesh.AllAreas, _navPath))
        {
            RenderPath(_navPath);
        }
    }

    private Vector3 GetUserWorldPosition()
    {
        // If the scene origin matches the VPS origin, the AR camera position is already in map coordinates.
        return ArCamera.transform.position;
    }

    private void RenderPath(NavMeshPath path)
    {
        PathLine.positionCount = path.corners.Length;
        PathLine.SetPositions(path.corners);
    }
}

Runtime behavior

  • Trigger LocalizeAsync on start or on demand (e.g., after entering a new floor/zone).

  • Keep listening to pose updates to handle movement and re‑localization.

  • Guard path computation until you have a valid localization.


From any angle

  • Because VPS provides a global pose, users can approach targets from any heading. Overlays remain accurate as long as they’re attached to the map coordinate system.



9) Align coordinates and avoid mis‑registration


Common pitfalls

  • Scale mismatch (centimeters vs meters).

  • Axis confusion (Z‑up vs Y‑up).

  • Double transforms (offsets baked both in the mesh and the parent GameObject).


Best practices

  • Normalize units = meters before import.

  • Keep a single authoritative transform for the environment root; apply zeroed child transforms.

  • Verify 3–5 known landmarks after localization to confirm alignment before demo or QA.
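For the landmark check, a small debug helper like the sketch below (field names are illustrative) can report the horizontal offset while a tester stands at each surveyed point:

    using UnityEngine;

    // Logs how far the user is from each surveyed landmark after localization.
    public class LandmarkCheck : MonoBehaviour
    {
        [System.Serializable]
        public struct Landmark { public string Name; public Vector3 MapPosition; }

        public Landmark[] Landmarks;      // 3-5 well-spread, easy-to-find points in map coordinates
        public Transform EnvironmentRoot; // root that receives the VPS pose
        public Camera ArCamera;

        // Call this (e.g., from a debug button) while standing directly at a landmark.
        public void Report()
        {
            Vector3 user = ArCamera.transform.position;
            foreach (var lm in Landmarks)
            {
                Vector3 world = EnvironmentRoot.TransformPoint(lm.MapPosition);
                float error = Vector3.Distance(new Vector3(world.x, 0f, world.z),
                                               new Vector3(user.x, 0f, user.z)); // ignore height
                Debug.Log($"{lm.Name}: horizontal offset {error:F2} m");
            }
        }
    }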



10) Build for device and test


iOS

  • Enable ARKit.

  • Camera/motion permissions in Info.plist.


Android

  • Enable ARCore.

  • Camera permissions and minimum SDK level as per ARCore requirements.
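On Android you can also request the camera permission explicitly at runtime; a minimal sketch using Unity's Permission API (iOS prompts automatically based on the Info.plist usage description):

    using UnityEngine;
    #if UNITY_ANDROID
    using UnityEngine.Android;
    #endif

    // Requests the camera permission at runtime on Android builds.
    public class CameraPermission : MonoBehaviour
    {
        void Start()
        {
    #if UNITY_ANDROID
            if (!Permission.HasUserAuthorizedPermission(Permission.Camera))
            {
                Permission.RequestUserPermission(Permission.Camera);
            }
    #endif
        }
    }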


Test plan

  • Walk multiple routes; watch for drift or jumps.

  • Test multi‑angle viewing; ensure overlays remain glued to real‑world surfaces.

  • Try different lighting conditions and moving crowds to simulate real operations.



11) Multi‑floor navigation patterns


Pattern A — Per‑floor maps & NavMeshes

  • Create one Space/Map per floor; swap NavMeshSurface and Space ID at floor changes (detected by proximity beacons, QR codes, or vertical movement heuristics).


Pattern B — Single unified map

  • Use one Space/Map spanning floors. Bake one NavMesh if the geometry allows continuous traversal (e.g., ramps). This simplifies cross‑floor routing but increases asset size.


Floor switching UX

  • Add a clear floor indicator and let users confirm a floor change to avoid confusion.
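A height-based floor switch for Pattern A could look like the sketch below. Floor heights, the 0.5 m tolerance, and component names are all illustrative; beacons or QR codes can drive the same logic.

    using UnityEngine;
    using Unity.AI.Navigation; // AI Navigation package (NavMeshSurface)

    // Picks the active floor from the AR camera's height and enables only that floor's NavMesh.
    public class FloorSwitcher : MonoBehaviour
    {
        [System.Serializable]
        public struct Floor { public string Label; public float BaseHeight; public NavMeshSurface Surface; }

        public Floor[] Floors;   // ordered bottom to top; heights in map/meter coordinates
        public Camera ArCamera;
        public int CurrentIndex { get; private set; } = -1;

        void Update()
        {
            float y = ArCamera.transform.position.y;
            int best = 0;
            for (int i = 1; i < Floors.Length; i++)
            {
                if (y >= Floors[i].BaseHeight - 0.5f) best = i; // small tolerance below each floor
            }
            if (best != CurrentIndex) Activate(best);
        }

        void Activate(int index)
        {
            CurrentIndex = index;
            for (int i = 0; i < Floors.Length; i++)
            {
                // NavMeshSurface adds its baked data when enabled and removes it when disabled.
                Floors[i].Surface.enabled = (i == index);
            }
            // Update the floor indicator UI and ask the user to confirm the change here.
        }
    }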



12) Performance, robustness, and security (enterprise checklist)


Performance

  • Use LODs and mesh decimation for visualization; keep nav proxies simple.

  • Stream large spaces; avoid loading the entire campus at once.

  • Keep path rendering lightweight (few vertices). Use object pooling for arrows.
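A minimal arrow pool (prefab and names are illustrative) might look like this; call ReleaseAll() before re-rendering a route, then Get() one arrow per decision point:

    using System.Collections.Generic;
    using UnityEngine;

    // Reuses arrow instances along the path instead of instantiating/destroying each frame.
    public class ArrowPool : MonoBehaviour
    {
        public GameObject ArrowPrefab;  // hypothetical direction-arrow prefab

        private readonly Stack<GameObject> _free = new Stack<GameObject>();
        private readonly List<GameObject> _active = new List<GameObject>();

        public GameObject Get(Vector3 position, Quaternion rotation)
        {
            GameObject arrow = _free.Count > 0 ? _free.Pop() : Instantiate(ArrowPrefab);
            arrow.transform.SetPositionAndRotation(position, rotation);
            arrow.SetActive(true);
            _active.Add(arrow);
            return arrow;
        }

        public void ReleaseAll()
        {
            foreach (var arrow in _active)
            {
                arrow.SetActive(false);
                _free.Push(arrow);
            }
            _active.Clear();
        }
    }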


Robustness

  • Re‑localize on uncertainty or long occlusions (a watchdog sketch follows this list).

  • For dynamic spaces (furniture, crowds), prioritize low‑drift VPS and ensure waypoints are placed on persistent features.
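One way to implement that re-localization trigger is a small watchdog on the AR Foundation session state; the MultiSet call is a commented placeholder, as in the earlier controller sketch.

    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Requests a fresh VPS localization when AR tracking degrades for more than a few seconds.
    public class RelocalizationWatchdog : MonoBehaviour
    {
        public float MaxSecondsWithoutTracking = 3f;
        private float _lostTimer;

        void Update()
        {
            if (ARSession.state == ARSessionState.SessionTracking)
            {
                _lostTimer = 0f;
                return;
            }

            _lostTimer += Time.deltaTime;
            if (_lostTimer >= MaxSecondsWithoutTracking)
            {
                _lostTimer = 0f;
                // await MultiSetVps.LocalizeAsync(...); // hypothetical; use your SDK's relocalization entry point
                Debug.Log("Tracking degraded; requesting re-localization.");
            }
        }
    }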


Security & deployment

  • MultiSet supports public cloud, private cloud, on‑prem, and on‑device options to meet enterprise IT policies.

  • Use role‑based access controls and audit logs when deploying at facility scale.

Pro tip: For regulated environments (pharma, labs, healthcare), start with on‑prem or private cloud deployments and a small pilot area before scaling to entire buildings.

Part 2 — Vuforia Area Targets vs MultiSet VPS (Indoor Navigation)


1) Mental model: recognition/tracking vs persistent localization

  • Vuforia Area Targets (AT) recognize and track a previously scanned environment. They’re powerful for detecting that “you are in this space” and stabilizing content within that area.

  • MultiSet VPS provides persistent localization across sessions, large footprints, floors, and changing conditions, built to deliver accurate 6‑DOF poses continuously as the user moves and views from any angle.


Key takeaway: If your requirement is navigation across large, dynamic indoor spaces with enterprise deployment needs, a VPS is typically the better architectural fit. If you need area recognition and limited in‑area tracking within a bounded space, Area Targets may be sufficient.



2) Feature‑by‑feature comparison

Capability | Vuforia Area Targets | MultiSet VPS
Data sources | Specific pipelines that include Matterport/E57 for Area Target generation | Scan‑agnostic: E57/Matterport, NavVis, Leica, and more
Initial lock | Area recognition to start tracking | Instant localization with a 6‑DOF pose (fast lock)
Accuracy & drift | Depends on target quality and device motion | High accuracy, low drift; resilient in dynamic scenes
Coverage/angles | Designed for bounded areas | 360° viewpoints, large footprints, multi‑floor
While moving | Tracking within the target area | Reliable localization while in motion
Scale | Suited to specific areas | Enterprise scale, multi‑site, campus‑level
Platforms | Unity, mobile | Unity, iOS, Android, WebXR, Quest, ROS
Deployment | Cloud‑centric | Public cloud, private cloud, on‑prem, on‑device
Security | General developer security | Enterprise‑grade security controls & auditability
Change tolerance | Sensitive to environmental changes | Built for dynamic environments


3) When to choose which (decision guide)


Choose Vuforia Area Targets if:

  • You need fast area recognition inside a small, bounded environment.

  • Your project scope is limited to in‑area content placement and simple interactions.

  • You already standardized on an AT workflow and have no enterprise deployment constraints.

  • Your enterprise budget can absorb upfront licensing costs of $30,000 USD or more.


Choose MultiSet VPS if:

  • You require indoor AR navigation with precise, persistent localization across large or multi‑floor facilities.

  • You need scan‑agnostic ingest (Matterport/E57, NavVis, Leica, etc.) and a reusable mapping pipeline.

  • You operate in regulated or restricted networks and need on‑prem/private‑cloud or on‑device options.

  • You must support multiple platforms (Unity, native iOS/Android, WebXR, headsets, or ROS) with one mapping base.



4) Migration notes: Area Targets → VPS


A practical path for teams moving from AT to VPS:

  1. Keep your scans. Reuse your existing Matterport/E57 captures as source data—no need to re‑scan.

  2. Generate a Space/Map in MultiSet Mapping and align it to your existing coordinate conventions.

  3. Swap runtime: Replace AT initialization with a VPS localization call; maintain your content anchoring logic.

  4. Validate accuracy: Walk test routes; verify alignment at known checkpoints; measure drift under motion and varying lighting.

  5. Scale up: Add floors/zones, implement asset streaming, and roll out to pilot users.



5) Cost and operations at facility scale (TCO notes)

  • Author once, deploy everywhere: A scan‑agnostic VPS eliminates redundant pipelines per building/vendor.

  • Security fit: On‑prem/private cloud avoids pushing facility data to third‑party SaaS if that’s a policy constraint.

  • Supportability: A single SDK/Space model across Unity, mobile, WebXR, and headsets simplifies maintenance.


Case studies & scenarios (what success looks like)


Hospitals & pharma labs

  • Wayfinding for visitors and technicians; strict network rules → on‑prem or private cloud.

  • Multi‑angle navigation near equipment, signage changes, and variable lighting.


Manufacturing & warehouses

  • Shift‑dependent congestion and frequent layout changes. VPS keeps overlays stable as operators move between aisles and floors.


Corporate & university campuses

  • Cross‑building continuity and BYOD. WebXR endpoints let you reach users without app installs while maintaining localization reliability.


Robotics support

  • For AMRs or inspection robots, VPS helps tether digital instructions or status overlays while the unit is moving—useful in training and operations.



Troubleshooting & optimization (field‑tested tips)


If overlays “slide”

  • Re‑check coordinate axes and units; confirm your environment root matches ingest conventions.

  • Verify the device time and camera permissions; ensure consistent lighting.


If lock is slow

  • Confirm Space/Map IDs and that localization data has synced to the device.

  • Reduce background CPU/GPU loads and test a simpler scene.


If paths cut corners or clip walls

  • Increase agent radius; refine nav proxies around tight corridors; rebake the NavMesh with higher resolution.


If multi‑floor transitions confuse users

  • Add a floor selector UI; show clear “Go up/down” cues near stairs/elevators; snap to the correct floor’s NavMesh based on height.


FAQs


Q1. Is E57 better than MatterPak for AR navigation?

E57 preserves dense, structured spatial data that often yields cleaner, more reliable nav geometry and alignment. Many teams prefer an E57‑first pipeline for indoor AR navigation.


Q2. Do I need to re‑scan to use MultiSet VPS?

No. If you have existing scans (E57/Matterport, NavVis, Leica, etc.), ingest them into MultiSet Mapping to produce a Space/Map and runtime assets.


Q3. How does MultiSet VPS differ from Vuforia Area Targets?

Area Targets specialize in area recognition/tracking inside bounded spaces. MultiSet VPS focuses on persistent localization at enterprise scale, with scan‑agnostic ingest, low drift, and broad deployment options including on‑prem and WebXR.


Q4. Can I run fully on‑prem or on‑device?

Yes. MultiSet supports public cloud, private cloud, on‑prem, and on‑device deployment options to meet security and compliance needs.


Q5. How do I support multi‑floor navigation?

Use per‑floor NavMeshes and either separate Spaces/Maps per floor or a unified map with floor switching logic. Clearly communicate floor changes to users.


Q6. Does this work when the user is moving or looking from any angle?

Yes. MultiSet VPS localizes continuously while the device moves, and overlays remain aligned from all angles (360°) when content is anchored to the map coordinate system.


Q7. What platforms are supported?

Unity, native iOS/Android, WebXR, Quest, and ROS, so you can address phones, tablets, headsets, and robots with one mapping base.



Glossary


  • E57: An open vendor‑neutral file format for 3D point clouds and related metadata.

  • MatterPak: A bundle of assets exported from a digital‑twin platform; may include meshes/point clouds useful for general workflows.

  • VPS (Visual Positioning System): A system that estimates a device’s 6‑DOF pose by matching camera data to a known map of the environment.

  • NavMesh: Unity’s baked navigation data defining walkable areas for pathfinding.

  • 6‑DOF: Six degrees of freedom (position X/Y/Z + rotation yaw/pitch/roll).


Next Steps


  • Try MultiSet VPS: Grab the Unity SDK, sample scenes (Navigation), and start with your own E57.

  • Get the sample project: Download a minimal scene with a NavMesh bake, waypoints, and a VPS initialization script.

  • See the comparison: Download the one‑page Vuforia Area Targets vs MultiSet VPS PDF.

  • Talk to an engineer: Book a 30‑minute feasibility check for your facility.

  • Choose MultiSet for your Matterport E57 Unity AR navigation project
