Best Practices for Spatial Tracking
Spatial tracking is what allows augmented reality experiences to anchor virtual content to real-world locations and objects. When you scan an environment or object properly, the AR system can recognize that space later and place digital content exactly where you want it, keeping it stable and properly positioned as users move around.

The technology works by analyzing a sequence of camera frames, identifying distinctive features, and building a map of the physical space. The better your scan, the more accurately the system can track the location and maintain stable AR anchors. This guide will show you how to create high-quality scans that result in reliable AR experiences.

We'll cover the essential techniques: how to move your camera through the space, how to ensure complete coverage of the area, what lighting conditions work best, and how to handle different types of surfaces and textures.
Moving Your Camera Through the Space
The spatial tracking system builds its understanding of your environment by comparing frames and finding the same features across multiple views. For this to work reliably, each frame needs to overlap significantly with the previous one, typically by 50% to 80%. This overlap lets the system track its position continuously and build an accurate spatial map.
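
If you are building or debugging your own capture pipeline, one rough way to sanity-check overlap is to match features between consecutive frames and treat the share of matched keypoints as a proxy for overlap. The sketch below is a minimal illustration assuming OpenCV-style frames and an ORB detector; the match-distance cutoff and the 50% warning level are placeholder values to tune for your setup, not fixed standards.

```python
import cv2

def estimate_overlap(frame_a, frame_b, max_features=1000, match_threshold=50):
    """Rough overlap proxy: fraction of frame A's keypoints also matched in frame B."""
    orb = cv2.ORB_create(nfeatures=max_features)
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)
    kp_a, des_a = orb.detectAndCompute(gray_a, None)
    kp_b, des_b = orb.detectAndCompute(gray_b, None)
    if des_a is None or des_b is None or len(kp_a) == 0:
        return 0.0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_a, des_b)
    # Keep only close descriptor matches; the cutoff is a placeholder.
    good = [m for m in matches if m.distance < match_threshold]
    return len(good) / len(kp_a)

# Example: warn when consecutive frames share too little content.
# if estimate_overlap(prev_frame, curr_frame) < 0.5:
#     print("Consecutive frames may not overlap enough for stable tracking")
```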

The Right Scanning Technique

When scanning an object or location for AR anchoring, move around it smoothly rather than standing still and rotating. You want continuous, fluid motion that gives the system a clear sense of how the space connects together.

Keep your movements steady and controlled. Jerky motions create two immediate problems. First, they cause motion blur in your frames, which makes it harder for the system to identify and track features. Second, they create inconsistent spacing in your scan data, which can result in tracking gaps or weak spots where the AR anchor might drift.

Your scanning speed is important. Move too quickly and your frames will blur, causing the system to lose tracking. Move too slowly and you'll capture dozens of nearly identical frames that add processing time without improving the spatial map. Aim for a smooth, walking pace as you orbit around your subject.
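
If your pipeline gives you access to the raw frames, a common heuristic for catching motion blur is the variance of the Laplacian: sharp frames have strong edges and score high, blurred frames score low. This is a minimal sketch assuming OpenCV frames; the threshold is a placeholder you would calibrate against frames you know are sharp from your own camera.

```python
import cv2

def is_blurry(frame, threshold=100.0):
    """Flag likely motion blur via the variance of the Laplacian.

    Sharp frames have strong edges and therefore high variance; blurred
    frames score low. The threshold is a placeholder to calibrate against
    known-sharp frames from your camera.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var() < threshold
```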

What to Avoid

Moving your phone around while standing in place doesn't provide enough camera translation for effective spatial mapping. The system needs to see how features look from different positions in 3D space, not just from different angles at the same location.
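
If your AR session exposes camera poses, you can quantify this directly by measuring how far the camera actually travels between frames. The sketch below assumes an array of camera positions in meters; the 1 cm warning threshold in the usage comment is illustrative, not a standard.

```python
import numpy as np

def mean_translation(positions):
    """Mean camera movement between consecutive frames, in the pose units.

    `positions` is assumed to be an (N, 3) array of camera positions reported
    by the AR session. Values near zero mean the camera is rotating in place
    rather than moving through the space, which gives the tracker no parallax.
    """
    positions = np.asarray(positions, dtype=float)
    if len(positions) < 2:
        return 0.0
    deltas = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    return float(deltas.mean())

# Example (assuming positions in meters):
# if mean_translation(session_positions) < 0.01:
#     print("Walk through the space instead of rotating in place")
```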

Sudden movements and jerky camera work will break tracking temporarily. When that happens, the system has to work harder to relocate itself, and you may end up with disconnected segments in your spatial map.

Random, unplanned movement usually leaves gaps in coverage. These gaps become weak points where AR content might not anchor reliably or might drift during playback. A systematic approach works much better.
Capturing Complete Spatial Coverage
For AR content to anchor reliably, the system needs a comprehensive map of the space from multiple perspectives. Think of it as teaching the system what the environment looks like from every angle a user might view it. The more complete your coverage, the more robust the tracking will be when someone views your AR content.

How to Scan in Multiple Passes

Begin at a height roughly level with the center of your object or the main area of interest. Walk in a complete circle around it, moving at a steady pace.

After completing your first pass, move to a lower position. Drop your camera below the center height and make another complete circle. Angle your camera slightly upward as you go. This captures the underside edges and base areas that weren't visible during your middle-height pass.

Next, raise your camera above the object or area and complete another full orbit. Tilt the camera down slightly to capture the top surfaces and overhead features. This three-layer approach ensures the AR system can track the space reliably whether users are standing, sitting, or viewing from different heights.
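
If you plan a capture path programmatically, the three passes can be expressed as simple orbits at different heights, each aimed at the subject so the low pass naturally tilts up and the high pass tilts down. The function below is a minimal sketch; the center, radius, heights, and step count are all assumptions to adapt to your scene.

```python
import math

def orbit_waypoints(center, radius, heights, steps=24):
    """Camera waypoints for the multi-pass orbit described above.

    `center` is the (x, y, z) of the subject, `radius` the orbit distance,
    and `heights` the camera heights for each pass (middle, low, high).
    Each waypoint pairs a camera position with the point to aim at, so the
    low pass tilts up at the subject and the high pass tilts down.
    """
    cx, cy, cz = center
    waypoints = []
    for height in heights:
        for i in range(steps):
            angle = 2.0 * math.pi * i / steps
            position = (cx + radius * math.cos(angle),
                        height,
                        cz + radius * math.sin(angle))
            waypoints.append((position, center))
    return waypoints

# Example: a subject centered 1.2 m up, scanned from 1.5 m away, with a
# middle pass, a low pass, and a high pass (all values are illustrative).
# passes = orbit_waypoints(center=(0.0, 1.2, 0.0), radius=1.5,
#                          heights=[1.2, 0.6, 1.8])
```
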
For environments with architectural details, furniture, or complex objects, add dedicated passes for those specific elements. Move closer and scan them from multiple angles. These detail scans give the system more reference points for precise tracking.

Critical Areas That Need Attention

The top surfaces of objects and overhead areas often get neglected. People naturally scan at eye level and forget to capture views from above. Make sure you include good overhead coverage, especially if users might look down at AR content from above.

The bottom and base areas require you to get your camera low. It might feel awkward, but these views are essential for complete spatial mapping. Users will approach from different heights, and the system needs data from all perspectives.

Recessed areas, corners, and cavities need dedicated attention. A single pass won't give the system enough information about these complex spaces. Approach them from multiple directions to ensure thorough coverage.

Distinctive features and points of interest deserve close-up scanning. If your AR content will anchor to specific details like artwork, signs, or architectural elements, scan those features from several positions to give the system strong reference points.
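
If you log the viewing direction for each frame, you can also check coverage numerically by binning directions by azimuth and elevation and listing the bins you never filled. This is a minimal sketch under a few assumptions: the direction vectors point from the camera toward the subject, y is up, and the bin counts are arbitrary.

```python
import math

def missing_view_bins(view_directions, az_bins=12, el_bins=3):
    """List azimuth/elevation bins that no recorded view falls into.

    `view_directions` is assumed to be a list of unit vectors from the camera
    toward the subject, derived from the session's camera poses (y up). Empty
    bins mark viewing angles you have not covered yet.
    """
    seen = set()
    for x, y, z in view_directions:
        azimuth = (math.degrees(math.atan2(z, x)) + 360.0) % 360.0
        elevation = math.degrees(math.asin(max(-1.0, min(1.0, y))))
        az_idx = min(int(azimuth / (360.0 / az_bins)), az_bins - 1)
        el_idx = min(int((elevation + 90.0) / (180.0 / el_bins)), el_bins - 1)
        seen.add((az_idx, el_idx))
    return [(a, e) for a in range(az_bins) for e in range(el_bins)
            if (a, e) not in seen]
```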
Lighting Conditions for Reliable Tracking
The spatial tracking system identifies and follows visual features across frames. When lighting creates harsh shadows or extreme highlights, it can confuse the system. Shadows that look like dark geometric features can become part of the spatial map, and if those shadows move or disappear in different lighting conditions, the tracking becomes unreliable.

The Best Lighting for AR Scanning

Uniform, diffused lighting produces the most reliable scans. This type of soft light doesn't create hard shadow edges that the system might mistake for geometric features. It prevents bright reflections that can wash out important surface detail. Most importantly, it keeps the appearance of your space consistent, which is exactly what the tracking system needs.

Overcast outdoor conditions are ideal for scanning. Clouds act as a massive natural diffuser, spreading sunlight evenly across your scene. There are no harsh shadows and no bright spots. The lighting is inherently stable.

For indoor scanning, use soft lighting sources or diffuse any hard lights you have. Bounce light off walls or ceilings rather than pointing it directly at your subject. If you're using studio lights, run them through diffusion panels or umbrellas.

Lighting Problems That Break Tracking

Direct light sources in your frame create lens flare and bright artifacts. These can confuse the spatial tracking system, causing it to identify false features or misread the geometry of your space. The flares also create inconsistency between frames, making it harder for the system to match features reliably.

Keep light sources out of your camera's view. Use the objects and walls in your environment to shield the camera from direct light. Monitor your frame as you move and adjust your path if you see flare appearing.
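
If you want an automated warning for this, a simple heuristic is to look for large, fully saturated regions in the frame, which usually indicate a visible light source or strong flare. The sketch below assumes OpenCV frames; the saturation level and minimum blob size are placeholder values.

```python
import cv2

def saturated_region_present(frame, saturation_level=250, min_area_fraction=0.002):
    """Heuristic check for a direct light source or strong flare in the frame.

    Looks for connected regions of fully saturated pixels larger than a small
    fraction of the image. Both thresholds are placeholders to tune.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, saturation_level, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    min_area = min_area_fraction * gray.shape[0] * gray.shape[1]
    return any(cv2.contourArea(c) > min_area for c in contours)
```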

Moving shadows are particularly problematic for AR spatial tracking. When shadows shift during your scan, the system sees features appearing and disappearing. This can create unstable tracking points that work during your scan but fail under different lighting conditions later.

Scan when lighting is stable and consistent. Avoid partly cloudy days, when shadows constantly change. Don't scan near windows at times of day when direct sunlight is shifting across the room. If you're indoors, make sure no one is moving around and casting shifting shadows.

Very deep shadows hide surface detail that the tracking system needs. When areas are too dark, the system can't identify features there, creating blind spots in your spatial map.

Use fill light or reflectors to bring detail into shadowed areas. You don't need to eliminate shadows entirely, just make sure they're not so deep that texture and features become invisible.

Lighting Checklist for AR Scanning

Verify that your light is diffused without harsh edges. Confirm that no direct light sources appear in your camera frame. Check that lighting remains stable throughout your entire scanning session. Make sure shadowed areas retain enough visible detail. Watch for any moving shadows from clouds, people, or vehicles.
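
Some of these checks can be automated if you have access to the captured frames. The sketch below computes, per frame, the fraction of near-black and near-white pixels, and tracks mean brightness across the session, where a large spread suggests the lighting changed mid-scan. It assumes OpenCV frames, and every threshold is a placeholder to tune for your camera.

```python
import cv2
import numpy as np

def lighting_report(frames, clip_low=10, clip_high=245, max_clip_fraction=0.05):
    """Per-frame clipping check plus a session-level brightness stability check.

    Flags frames where too many pixels are near-black (deep shadow) or
    near-white (blown highlights), and reports the spread of mean brightness
    across the session; a large spread suggests the lighting changed mid-scan.
    """
    mean_brightness, flagged = [], []
    for i, frame in enumerate(frames):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        dark_fraction = float(np.mean(gray < clip_low))
        bright_fraction = float(np.mean(gray > clip_high))
        mean_brightness.append(float(gray.mean()))
        if dark_fraction > max_clip_fraction or bright_fraction > max_clip_fraction:
            flagged.append(i)
    spread = float(np.std(mean_brightness)) if mean_brightness else 0.0
    return {"frames_with_clipping": flagged, "brightness_std": spread}
```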
Surface Textures and Visual Features
The AR tracking system works by identifying and following distinctive visual features. For this to work, the surfaces in your environment need to have enough visual detail for the system to latch onto. Completely plain, featureless surfaces give the algorithm nothing to track, which can result in drift or lost tracking.

Problematic Surface Types

Overexposed areas where detail has blown out to pure white are invisible to the tracking system. These sections provide no features to follow, creating blind spots in your spatial map. When users experience your AR content later, these areas won't contribute to tracking stability.

Fix this by reducing light intensity. Move lights farther from bright surfaces. If your camera has exposure controls, bring down the highlights. Some cameras offer HDR modes that can capture detail in both bright and dark areas simultaneously.


Completely black, underexposed areas have the opposite problem with the same result. No visible detail means no trackable features. Dark corners and deep shadows become dead zones where the system can't maintain reliable tracking.

Add fill light to recover detail in dark areas. Position reflectors to bounce light into shadow areas without creating new harsh lighting.


Smooth, uniform surfaces present the biggest challenge for spatial tracking. Blank walls, glossy floors, reflective glass, and polished metal offer almost nothing for the system to track. A white wall might be visible to you, but to the tracking system it's essentially featureless.

When possible, include textured elements in your scan. If you're scanning a minimalist space with plain walls, make sure furniture, artwork, or architectural details are visible in your frames. The system can track these features and use them to maintain position even when the camera looks at the plain areas.


For temporary installations or testing, you can add markers to featureless surfaces. Even simple printed patterns placed strategically can give the system reference points. These markers can be removed after scanning if needed.

Some surfaces can't be easily modified. In these cases, adjust your framing to always include some textured elements alongside the smooth areas. A plain white wall by itself is untrackable, but when your frame includes the textured ceiling edge or a nearby piece of furniture, the system has enough information to work with.

Ideal Surfaces for AR Tracking

The best surfaces for spatial tracking are matte rather than glossy. They don't create bright reflections that can confuse the system. They have visible texture, patterns, or variation rather than being completely uniform.

Good contrast helps too, with areas of different brightness giving the system more distinct features to track. The surface characteristics should remain stable as you move around them, without dramatic changes in appearance from different angles.

Framing and Composition

Frame your shots to include plenty of trackable detail. If you're scanning an object, get close enough that it fills a good portion of your frame. If you're scanning a space, make sure each frame includes distinctive features rather than mostly blank walls or floor.

The tracking system works best when it can identify and follow multiple features simultaneously. Frames that show only featureless surfaces give it nothing to work with, even if those frames show the area you want to cover.
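
One practical way to catch featureless framing before it becomes a problem is to count how many distinctive keypoints a detector finds in each frame. The sketch below uses OpenCV's ORB detector as a stand-in for whatever feature detector your tracking system actually uses; the warning threshold in the usage comment is a placeholder.

```python
import cv2

def trackable_feature_count(frame, max_features=1000):
    """Count distinctive keypoints in a frame as a rough texture check.

    Frames dominated by blank walls or glossy, uniform surfaces return very
    low counts and are likely to become weak spots in the spatial map.
    """
    orb = cv2.ORB_create(nfeatures=max_features)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return len(orb.detect(gray, None))

# Example: flag frames that give the tracker too little to work with.
# if trackable_feature_count(frame) < 50:  # threshold is a placeholder
#     print("Reframe to include more textured, distinctive detail")
```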
Summary
Creating reliable spatial tracking for AR experiences comes down to four essential principles.
Smooth, controlled camera movement through the space ensures continuous tracking and proper data overlap. Think of it as a steady orbit rather than random motion.

Complete coverage from multiple heights and angles builds a robust spatial map. The more thoroughly you scan your environment, the more reliably the AR system can anchor content and maintain tracking as users move around.

Consistent, diffused lighting eliminates the false features and instabilities that come from harsh shadows and bright highlights. Stable lighting conditions during scanning lead to stable tracking during playback.

Adequate surface texture and visual features give the system something to track. Watch for areas that are too bright, too dark, or too plain, and address them before they become weak points in your spatial map.

When you combine these principles, you'll create spatial scans that result in stable, accurate AR experiences. The system can only track what you give it, so taking the time to scan properly means your AR content will stay exactly where you want it, creating believable and reliable augmented reality experiences.