The city-scale augmented reality platform MAXST has announced its newest ‘Sensor Fusion SLAM’ solution, which combines a camera with an IMU (Inertial Measurement Unit) sensor.
Simultaneous Localization and Mapping, widely known as SLAM, is a core technology for autonomous driving. In its most common form, SLAM combines a Lidar sensor with a camera to generate 3D maps of the real world. However, this approach is held back by high cost and the substantial labor required for regular map updates. For these reasons, SLAM development has so far been limited to a handful of large companies.
By contrast, ‘Sensor Fusion SLAM’ does not require a Lidar sensor: it fuses the camera with the IMU sensor already built into most smartphones to deliver SLAM that is both practical and powerful. In industries that use smart glasses, robots, and drones, adoption of SLAM is growing steadily thanks to its tracking accuracy and precision.
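The camera-plus-IMU idea described above can be sketched, in highly simplified form, as a complementary filter: the gyroscope gives a fast but drifting motion estimate, and the camera supplies a slower, drift-free correction. The function name, constants, and simulated sensor values below are illustrative assumptions for a single rotation axis, not MAXST's actual algorithm:

```python
# Minimal sketch of camera/IMU fusion on one rotation axis.
# NOT MAXST's implementation -- an illustrative complementary filter.

def fuse_orientation(gyro_rate, dt, prev_angle, cam_angle, alpha=0.98):
    """Blend gyro integration (fast but drifting) with a camera-derived
    angle (slow but drift-free). 'alpha' weights the gyro path."""
    gyro_angle = prev_angle + gyro_rate * dt  # dead-reckon from the IMU
    return alpha * gyro_angle + (1 - alpha) * cam_angle

# Simulated scenario: the device rotates at 1.0 rad/s, the gyro reads
# that rate plus a constant 0.05 rad/s bias, and the camera measures
# the true angle. Pure gyro integration would drift by 0.5 rad over
# 10 seconds; the fused estimate stays close to the truth.
angle = 0.0       # fused estimate
true_angle = 0.0  # ground truth (what the camera observes here)
bias = 0.05       # constant gyro bias, rad/s
dt = 0.01         # 100 Hz update rate
for _ in range(1000):
    true_angle += 1.0 * dt
    angle = fuse_orientation(1.0 + bias, dt, angle, true_angle)

print("fused error (rad):", abs(angle - true_angle))
```

Real visual-inertial SLAM systems fuse full 6-DoF pose with an extended Kalman filter or nonlinear optimization, but the division of labor is the same: the IMU bridges the gaps between camera frames, and the camera keeps the IMU's drift in check.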
Xiaomi recently launched a robot vacuum built on ‘Sensor Fusion SLAM’, one of many commercial applications of the technology.
While SLAM has mainly been the preserve of large companies, small and medium-sized businesses have been building on open-source implementations. MAXST reports that its Sensor Fusion SLAM outperforms ‘VINS-Mono’ and ‘ORB-SLAM2’, two such open-source alternatives.
MAXST envisions its technology empowering companies in areas beyond AR, such as smart glasses, robots, and drones, so that those companies can focus solely on their own fields of work.