Inuitive, a vision-on-chip processor company, announced the launch of its latest sensor modules: the M4.5S and the M4.3WN. Designed for easy integration into robot and drone systems, both sensor modules are built around the NU4000 vision-on-chip (VoC) processor and combine depth sensing and image processing with AI and VSLAM capabilities to provide robotic devices with human-like visual understanding.
The M4.5S provides robots with enhanced depth-from-stereo sensing along with AI-based obstacle detection and object recognition. It features the industry's widest field of view at 88×58 degrees, the shortest minimum sensing range at 9 cm, and a wide operating temperature range of up to 50 degrees Celsius. The M4.5S is a highly power-efficient platform designed to function as a self-sufficient depth sensor module and to shorten time to market for commercial robotic systems with fixed industrial designs and constraints.
The company’s other newly launched sensor module, the M4.3WN, offers accurate tracking and VSLAM navigation based on fisheye cameras and an IMU, together with depth sensing and on-chip AI processing. This enables free navigation, localisation, path planning, and static and dynamic obstacle avoidance: the main challenges for AMR and AGV systems. The M4.3WN is housed in a metal case designed to withstand industrial environments.
“Our new all-in-one sensor modules expand our portfolio targeting the growing market of autonomous mobile robots. Together with our category-leading Vision-on-Chip processor, we now enable robotic devices to look at the world with human-like visual understanding,” says Shlomo Gadot, CEO and co-founder of Inuitive. “Inuitive is fully committed to continuously developing the best-performing products for our customers and becoming their supplier of choice.”
The primary processing unit of both the M4.5S and the M4.3WN sensor modules is Inuitive’s all-in-one NU4000 processor. Both modules are equipped with depth and RGB sensors that are controlled and timed by the NU4000. Data generated by the sensors is processed in real time at a high frame rate by the NU4000 and then used to generate depth information for the host device.