The long-awaited reveal of the Magic Leap One: Creator Edition capped off a milestone year for Augmented Reality and Mixed Reality. fxguide takes a snapshot of the AR & MR landscape and the implications for experience design.

Broad Visions

In 2017, Google’s ARCore and Apple’s ARKit releases put markerless tracking into handheld mobile devices, and Microsoft’s Mixed Reality (formerly Windows Holographic) was released as a standard component of Windows 10. The vision of seamless AR across mainstream mobile devices is getting closer, but we are not quite there yet. While AR/MR vendors continue to think big, terminology continues to confuse, with broad and sometimes overlapping definitions ready to trap the uninitiated.

Representing the latest generation of mobile AR, both Google’s ARCore and Apple’s ARKit build up a dynamic model of the physical world around the device from a combination of visual (camera) and inertial (motion sensor) data, though each relies on a different algorithm. ARKit’s technique, called Visual Inertial Odometry (VIO), relies on low-level integration between camera data and Core Motion data from onboard sensors (accelerometer and gyroscope). ARCore builds on Google’s first mobile AR effort, Project Tango. It uses Concurrent Odometry and Mapping (COM) to track a wider variety of planes (e.g. ramps, vertical walls) than ARKit, which only tracks horizontal planes.
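To make the plane-tracking side of this concrete, here is a minimal ARKit sketch (the view controller class and outlet name are placeholders, not from any shipping app) that runs a world-tracking session with horizontal plane detection and logs each plane the VIO pipeline fits:

```swift
import UIKit
import ARKit
import SceneKit

class ARViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // VIO runs inside ARWorldTrackingConfiguration; the app only
        // opts in to the plane types it wants reported.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal  // the only option in the first ARKit release
        sceneView.delegate = self
        sceneView.session.run(configuration)
    }

    // Called once tracking has accumulated enough feature points to fit
    // a plane; the anchor's extent keeps growing as more data arrives.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        print("Detected plane at \(planeAnchor.center), extent \(planeAnchor.extent)")
    }
}
```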

Prior to release, much of the iOS 11 / ARKit speculation suggested the inclusion of tracking technology known as Simultaneous Localization and Mapping (SLAM), a broad family of problems and algorithms defined by the robotics community, some of which leverage 3D point cloud data provided by depth cameras. However, the first release of ARKit (followed swiftly by the release of ARCore) provided a more lightweight platform that does not require depth sensors. Simpler tracking without full SLAM opened up ARKit compatibility to older devices such as the iPhone 6s from 2015 (though not, unfortunately, to the iPad mini 4 from the same year). ARCore compatibility is currently limited to Google Pixel, Pixel XL, Pixel 2, and Samsung Galaxy S8 devices.
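In practice, that hardware cut-off surfaces to developers as a simple runtime check. A small sketch (the helper function here is hypothetical) of gating an AR experience on device support:

```swift
import ARKit

// ARConfiguration.isSupported reflects the hardware gate described above:
// world tracking needs an A9 chip or later, i.e. iPhone 6s and up.
func startARIfAvailable(on session: ARSession) {
    guard ARWorldTrackingConfiguration.isSupported else {
        // e.g. an iPad mini 4: fall back to a non-AR presentation
        print("World tracking is not supported on this device")
        return
    }
    session.run(ARWorldTrackingConfiguration())
}
```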

Apple’s iPhone X introduced a depth camera (TrueDepth), but one designed for facial AR tracking rather than general-purpose location tracking. The TrueDepth system projects 30,000 infrared dots onto the user’s face to track facial poses and expressions. Importantly, the TrueDepth hardware sits behind only the front-facing of the iPhone X’s two cameras; the rear camera has no depth sensing.
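A brief sketch of what facial AR against that front-facing camera looks like in code, using ARKit’s face-tracking configuration and its per-expression blend shape coefficients (the class name is illustrative):

```swift
import ARKit

class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // ARFaceTrackingConfiguration is only supported on TrueDepth
        // hardware, i.e. the front camera of the iPhone X.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // blendShapes reports expressions as 0-1 coefficients,
            // e.g. how far open the jaw is.
            if let jawOpen = faceAnchor.blendShapes[.jawOpen] {
                print("jawOpen: \(jawOpen.floatValue)")
            }
        }
    }
}
```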

Without much fanfare, older AR tracking technologies like geo-location and image recognition (i.e. of tracking markers) continue to be useful, even if Apple’s and Google’s rhetoric can suggest that these have been superseded by ARKit/ARCore. From a designer’s perspective, vendor lock-in remains an issue for any non-trivial use case that spans image recognition and markerless tracking.

Imagine a scenario where you point your mobile device at a specific real-world object (say a movie poster or toy) to have it recognised as a marker that triggers a location-specific experience, one which is then seamlessly tracked as you walk away from the marker. Neither ARKit nor ARCore is sufficient to make this work without additional components, focused as they are on markerless tracking.
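As a rough illustration of the gap, one possible (entirely hypothetical) glue layer pairs ARKit with Apple’s Vision framework: a Core ML classifier recognises the poster in the camera feed, then an ARAnchor hands the experience over to markerless world tracking. The classifier, its "movie_poster" label, and the confidence threshold below are all assumptions for the sketch, and a plain classifier gives no pose for the marker, so the anchor placement is a crude guess; that, in essence, is why additional components exist.

```swift
import ARKit
import Vision
import simd

final class MarkerTrigger {
    let session: ARSession
    let request: VNCoreMLRequest

    // `model` would wrap a hypothetical Core ML classifier trained on the poster.
    init(session: ARSession, model: VNCoreMLModel) {
        self.session = session
        self.request = VNCoreMLRequest(model: model)
    }

    // Run recognition on the raw camera image behind the AR session.
    func checkCurrentFrame() {
        guard let frame = session.currentFrame else { return }
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, options: [:])
        do { try handler.perform([request]) } catch { return }
        guard let top = request.results?.first as? VNClassificationObservation,
              top.identifier == "movie_poster",   // hypothetical label
              top.confidence > 0.9 else { return }

        // Hand off to markerless tracking: pin an anchor 0.5 m in front of
        // the camera so the experience persists as the user walks away.
        var translation = matrix_identity_float4x4
        translation.columns.3.z = -0.5
        let anchor = ARAnchor(transform: frame.camera.transform * translation)
        session.add(anchor: anchor)
    }
}
```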

The recent Vuforia 7 release aims to help developers deal with this AR fragmentation by providing AR cloud services and a single API (Vuforia Fusion) across more than 100 different Android and iOS device models. Its Ground Plane feature brings limited horizontal plane tracking to older devices that are not ARKit/ARCore compatible. An exclusive partnership with Unity, however, means that Unreal developers are not supported.
