S.L.A.M & Tracking Technology

Anna Sharpes
5 min read · Nov 8, 2020
https://www.youtube.com/watch?v=6vo8ZwaLUUc

S.L.A.M

From what I understood after reading Shweta Mayekar’s article, the acronym S.L.A.M stands for Simultaneous Localization and Mapping. It is an algorithmic technology used in AR/VR applications that gathers data from the physical world as reference points and translates it into a virtual environment in real time. The reference points help the machine distinguish between the floors, walls, and other barriers in the area. It’s kind of like creating a virtual 3D spatial map that uses the physical environment as a resource. S.L.A.M is especially useful in unknown environments where no GPS signal or existing map is available. Overall, S.L.A.M helps AR and VR technologies create visually clearer and more stable digital realities.
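To help myself make sense of the “localize and map at the same time” idea, I put together a tiny toy sketch in Python. It is not a real S.L.A.M algorithm — the landmark names, offsets, and the simple “nudge the pose back” correction are all made up by me — but it shows the basic idea of building a map of reference points while using re-observed points to correct your own position.

```python
# Toy 2D illustration of the S.L.A.M idea: estimate our own position
# (localization) while also building a map of landmarks (mapping),
# using only relative measurements. Deliberately simplified; not a real
# S.L.A.M algorithm -- the numbers and names are invented.

pose = [0.0, 0.0]          # our estimated (x, y) position in the virtual map
landmark_map = {}          # landmark id -> estimated (x, y) position

# Each step: (motion we think we made, {landmark id: measured offset from us})
steps = [
    ((1.0, 0.0), {"wall_corner": (2.0, 1.0)}),
    ((1.0, 0.1), {"wall_corner": (0.9, 0.9), "door": (3.0, -1.0)}),
    ((1.0, 0.0), {"door": (1.95, -1.05)}),
]

for motion, observations in steps:
    # 1) Predict: dead-reckon the new pose from the motion estimate.
    pose = [pose[0] + motion[0], pose[1] + motion[1]]

    for lid, offset in observations.items():
        measured = (pose[0] + offset[0], pose[1] + offset[1])
        if lid not in landmark_map:
            # 2) Map: first sighting -- add the landmark as a reference point.
            landmark_map[lid] = measured
        else:
            # 3) Localize: re-seeing a known landmark lets us correct drift
            #    by nudging our pose toward what the stored map position implies.
            known = landmark_map[lid]
            pose[0] += 0.5 * (known[0] - measured[0])
            pose[1] += 0.5 * (known[1] - measured[1])

print("estimated pose:", pose)
print("landmark map:", landmark_map)
```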

https://www.andreasjakl.com/basics-of-ar-slam-simultaneous-localization-and-mapping/

The Parts That Make The Whole

I learned from reading professor Andreas Jakl’s blog that S.L.A.M is actually made up of four parts that make it work as a whole. The four parts are the front-end, the back-end, sensor data, and the S.L.A.M estimate. The front-end is feature extraction, or the work of reducing the amount of data needed to describe a given physical environment. The extracted features need to be related to actual landmarks or map points so that the technology can reduce drift by recognizing what it has already seen. The back-end localizes the camera, handles the overall geometric reconstruction of the virtual world, and takes special care of the relationships between different frames. Sensor data is the data from a mobile device’s cameras, gyroscope, and accelerometer (an instrument for measuring acceleration), and it can be augmented by other sensors as well, like a depth sensor or GPS. The last part of the whole, the S.L.A.M estimate, is the result: it contains the tracked features, their locations and relations, and the position of the camera in the physical world.
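To keep the four parts straight in my head, I sketched how I imagine the data flowing between them. The names and fields below are my own simplification of what Jakl describes, not code from any actual S.L.A.M library.

```python
from dataclasses import dataclass, field

# Rough sketch of the four S.L.A.M parts: sensor data -> front-end ->
# back-end -> S.L.A.M estimate. All names and values are placeholders.

@dataclass
class SensorData:
    camera_frame: list        # raw pixels (here just a placeholder list)
    gyroscope: tuple          # angular velocity (x, y, z)
    accelerometer: tuple      # linear acceleration (x, y, z)

@dataclass
class SlamEstimate:
    camera_pose: tuple                               # where the camera sits in the world
    map_points: dict = field(default_factory=dict)   # landmark id -> 3D position

def front_end(data: SensorData) -> list:
    """Feature extraction: boil a big camera frame down to a few
    distinctive landmarks that can be recognized again later (to fight drift)."""
    return [("corner", 12, 34), ("edge", 56, 78)]    # dummy features

def back_end(features: list, data: SensorData, previous: SlamEstimate) -> SlamEstimate:
    """Localize the camera and reconstruct geometry, keeping the
    relationships between this frame and earlier frames consistent."""
    new_pose = previous.camera_pose                  # in reality: optimized from features + IMU
    new_map = dict(previous.map_points)
    for name, u, v in features:
        new_map.setdefault(name, (u, v, 0.0))
    return SlamEstimate(camera_pose=new_pose, map_points=new_map)

# One iteration of the whole pipeline.
reading = SensorData(camera_frame=[0] * 100, gyroscope=(0, 0, 0.1), accelerometer=(0, 0, 9.8))
estimate = back_end(front_end(reading), reading, SlamEstimate(camera_pose=(0, 0, 0)))
print(estimate)
```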

https://trackinno.com/2017/06/21/asset-tracking-technologies/

Tracking Technology

From reading the book “AR: Principles and Practice”, I was able to learn about three important terms that overlap when it comes to measuring and aligning objects for augmented reality. These three terms are tracking, calibration, and registration. Tracking describes the dynamic sensing and measuring an AR system does so that virtual objects attached to physical objects in 3D space keep the same relative position and orientation. Calibration is the process of comparing measurements made with two different devices: a reference device and the device that will be calibrated. A fun, small thing I learned is that calibration, unlike tracking, which runs all the time, is only done at discrete times, or may even be done just once in a device’s lifetime. The last term, registration, refers in AR to the specific alignment of coordinate systems between virtual and physical objects. For this to work, it usually requires tracking the user’s head, the camera providing the video background, or both.
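The “compare a reference device against the device being calibrated” idea clicked for me once I wrote it out as a few lines of Python. The readings and the simple averaged offset below are entirely invented for illustration; real calibration procedures are more involved.

```python
# Tiny illustration of calibration: compare a trusted reference device
# against the device being calibrated and store a one-time correction.
# All readings are invented.

reference_readings = [1.00, 2.00, 3.00]   # from the reference device (meters)
device_readings = [1.08, 2.07, 3.09]      # from the device being calibrated

# Average error becomes a correction applied from then on --
# done once (or at discrete times), unlike tracking, which runs constantly.
offset = sum(r - d for r, d in zip(reference_readings, device_readings)) / len(device_readings)

def calibrated(raw_measurement):
    return raw_measurement + offset

print(round(calibrated(2.57), 3))   # corrected measurement
```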

Two Types of Registration

  • Static Registration — When the user or the camera is not moving at all, registration only requires calibrating the tracking system so that it can establish a common coordinate system between the virtual and physical objects.
  • Dynamic Registration — When the user or the camera is moving, this type of registration requires tracking (a small sketch contrasting the two follows below).
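Here is how I picture the difference in code: with static registration the virtual-to-physical offset is worked out once by calibration, while with dynamic registration a freshly tracked camera pose has to be applied every frame. The poses are simplified to plain translations and every number is made up.

```python
# Simplified illustration of static vs. dynamic registration.
# Poses are just (x, y, z) translations here; real systems use full
# rotation + translation transforms. All values are invented.

# Static registration: the camera doesn't move, so one calibration step
# is enough to establish a common coordinate system.
calibration_offset = (0.5, 0.0, 2.0)   # measured once, e.g. with a calibration target

def place_virtual_object_static(object_pos):
    # Apply the fixed, calibrated offset -- no tracking needed afterwards.
    return tuple(o + c for o, c in zip(object_pos, calibration_offset))

# Dynamic registration: the camera moves, so we need a tracked camera
# pose every frame to keep virtual and physical objects aligned.
def place_virtual_object_dynamic(object_pos, tracked_camera_pose):
    return tuple(o - c for o, c in zip(object_pos, tracked_camera_pose))

print(place_virtual_object_static((1.0, 0.0, 0.0)))
for frame_pose in [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.2, 0.0, 0.0)]:   # camera walking
    print(place_virtual_object_dynamic((1.0, 0.0, 0.0), frame_pose))
```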
https://3dcoil.grupopremo.com/blog/electromagnetic-motion-tracking-virtual-reality/

Stationary Tracking Systems

There are certain types of tracking that can be used for different AR/VR systems, and the choice determines the performance, cost, size, weight, and power consumption trade-offs for creators, depending on what’s needed or wanted. Due to their stationary nature, the trackers below, introduced in the 1990s, are not very popular in AR today. Although they aren’t used very much currently, I still thought they were interesting to learn about when reading the “AR: Principles and Practice” book.

  • Mechanical Tracking — One of the oldest techniques, it builds on mechanical engineering methods to track a person’s motions and positions relative to the machine, usually something like a robotic arm, that the user is attached or hooked up to.
  • Electromagnetic Tracking — Uses a stationary source that produces three magnetic fields at right angles to one another; measuring their strength and direction gives both position and orientation.
  • Ultrasonic Tracking — Measures the time of flight of a sound pulse traveling from the source to a sensor. For this to work, multiple sensors need to send their pulses in sequence, and there needs to be no disturbance from loud environmental noise to avoid measurement errors (the basic time-of-flight math is sketched below).
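The time-of-flight math for the ultrasonic case is simple enough to show with numbers. The speed of sound is the usual ~343 m/s in room-temperature air; the pulse timings and sensor names are invented for the example.

```python
# Ultrasonic tracking: distance = speed of sound * time of flight.
SPEED_OF_SOUND = 343.0   # m/s in air at roughly room temperature

def distance_from_time_of_flight(seconds):
    return SPEED_OF_SOUND * seconds

# Invented pulse timings from three sensors firing in sequence.
# With distances to three (or more) sensors at known positions,
# the source's 3D position can then be solved by trilateration.
for sensor_id, tof in [("A", 0.0029), ("B", 0.0035), ("C", 0.0041)]:
    print(sensor_id, round(distance_from_time_of_flight(tof), 3), "m")
```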
https://www.iphonehacks.com/2017/05/apple-bosch-ar-experience-iphone-8.html

Mobile Sensors

The stationary trackers above are suitable for certain types of VR applications that don’t require a user to move around much. However, that doesn’t work very well for augmented reality, because its tracking system should be mobile, not stationary. The thing with mobile tracking systems is that they have to run locally on mobile devices for AR users to be able to roam about in an unconstrained environment, whether indoors or outdoors. The problem with this is that it limits the processing power and the techniques that can operate with mobile sensors. All modern mobile devices, whether phones or tablets, are equipped with an array of sensors, like, for example, GPS or wireless networks (Wi-Fi, Bluetooth, and mobile phone networks), that can help determine a person’s position.
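As a small example of how a phone’s GPS readings can help determine position, here is a sketch that turns two latitude/longitude fixes into a distance using the standard haversine formula. The coordinates are made up; real AR apps would combine this with the other sensors mentioned above.

```python
import math

# Sketch: distance between two made-up GPS fixes via the haversine formula.
EARTH_RADIUS_M = 6_371_000

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

# Two invented GPS readings a few seconds apart.
previous_fix = (47.6205, -122.3493)
current_fix = (47.6210, -122.3488)
print(round(haversine(*previous_fix, *current_fix), 1), "meters moved")
```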

Conclusion

Currently, S.L.A.M for AR is still developing, although it is developing really fast. I think that this technology could be very useful for my group’s project. It could collect data about the location and layout of Catalyst’s building overall, but more specifically the teachers’ offices, and create a virtual spatial map of the same physical layout. Then we could use the visual layout to implement our ribbon idea and use other AR technology, like one of the mobile sensors, to create the ribbon mapping system that will lead people in the right direction toward the professor they wish to speak to.

https://www.youtube.com/watch?v=tcJHnHpwCXk
