How does a robot know where it is in any given environment? How can it adapt to changes in that environment without getting lost? Numerous simultaneous localization and mapping (SLAM) algorithms have attempted to solve this problem, each with unique pros and cons. The challenge we set out to resolve is how to make SLAM adaptable to the application it serves: can any combination of sensors be optimised to achieve the best navigation output for the robot?
NavStack is our in-house multi-sensor SLAM system, designed from the ground up to be robust, scalable and adaptable. It is the ideal SLAM development platform for bespoke, mission-critical robotics, because the system can be adapted to suit each individual project.
The NavStack engine can utilise a number of different sensors in various configurations to provide a fit-for-purpose navigation solution for most small autonomous vehicles, while remaining computationally efficient and intrinsically robust.
6DOF map alignment.
Co-habits comfortably with processor/GPU-intensive algorithms on the same hardware.
Infinitely scalable maps: each descriptor typically covers an area of 100–500 m².
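The scalability claim above rests on the map being partitioned into small local descriptors rather than held as one monolithic structure. The sketch below illustrates the idea under stated assumptions: NavStack's actual map structures are not public, so the `TiledMap` and `MapDescriptor` names, the grid tiling, and the tile size are all illustrative. The point it shows is that queries only ever touch the descriptors near the robot, so the working set stays bounded however large the map grows.

```python
import math
from dataclasses import dataclass, field

TILE_SIZE_M = 15.0  # ~225 m^2 per descriptor, inside the quoted 100-500 m^2 range


@dataclass
class MapDescriptor:
    """One local patch of the map: landmarks expressed in a local frame."""
    tile: tuple                                   # integer (ix, iy) grid index
    landmarks: list = field(default_factory=list)


class TiledMap:
    """Stores descriptors by grid tile; only nearby tiles are ever touched."""

    def __init__(self):
        self._tiles = {}

    def _key(self, x, y):
        # Map a world coordinate to its tile index.
        return (math.floor(x / TILE_SIZE_M), math.floor(y / TILE_SIZE_M))

    def add_landmark(self, x, y):
        key = self._key(x, y)
        tile = self._tiles.setdefault(key, MapDescriptor(tile=key))
        tile.landmarks.append((x, y))

    def descriptors_near(self, x, y, radius_m):
        """Yield only the descriptors whose tiles overlap the query disc."""
        r = int(math.ceil(radius_m / TILE_SIZE_M))
        cx, cy = self._key(x, y)
        for ix in range(cx - r, cx + r + 1):
            for iy in range(cy - r, cy + r + 1):
                if (ix, iy) in self._tiles:
                    yield self._tiles[(ix, iy)]
```

A landmark 400 m away lives in a different descriptor, so a local query near the origin never loads it; memory and query cost track the robot's surroundings, not the total mapped area.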
Map Re-use and Sharing
Efficiently use pre-existing maps.
Share maps between two platforms in real time.
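Sharing a map in real time between two platforms works naturally with a descriptor-based map: each platform sends only the patches the other is missing. The sketch below is a minimal illustration of that exchange; NavStack's wire format is not public, so the function names are hypothetical and JSON stands in for what would realistically be a compact binary encoding.

```python
import json


def encode_patch(patch_id, landmarks):
    """Serialise one map patch for transmission.

    JSON is used here for clarity; a real radio link would use a
    compact binary encoding.
    """
    return json.dumps({"id": patch_id, "landmarks": landmarks}).encode()


def merge_patch(local_map, message):
    """Merge a received patch into the local map, skipping duplicates
    so repeated delivery of the same message is harmless."""
    patch = json.loads(message.decode())
    existing = local_map.setdefault(patch["id"], [])
    for lm in patch["landmarks"]:
        if lm not in existing:
            existing.append(lm)
    return local_map
```

Making the merge idempotent matters on a lossy link: the sender can retransmit a patch without corrupting the receiver's map.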
Fail-safe Control Outputs
Reflect the best estimate of platform velocities, even when the position estimate is known to be bad.
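The fail-safe behaviour above can be sketched as a small output stage: velocity is always published, and the pose is explicitly flagged invalid when its uncertainty is too high, rather than emitting a confident-looking but wrong position. This assumes the navigation filter exposes a pose, a velocity estimate, and a scalar position uncertainty; the names and the threshold are illustrative, not NavStack's API.

```python
POSITION_UNCERTAINTY_LIMIT = 2.0  # metres; illustrative threshold


def control_output(pose, velocity, position_sigma):
    """Always publish a usable velocity estimate; withhold the pose and
    flag it invalid when its uncertainty exceeds the limit."""
    pose_valid = position_sigma <= POSITION_UNCERTAINTY_LIMIT
    return {
        "velocity": velocity,                 # best estimate, always published
        "pose": pose if pose_valid else None,
        "pose_valid": pose_valid,
    }
```

Downstream consumers (e.g. a velocity controller) keep running on the velocity channel while position-dependent behaviours pause until `pose_valid` returns.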
Map Failure Recovery after SLAM Errors
‘Bad data’ can be pruned out of the map.
Position uncertainty is quantified, and an accurate position estimate can be regained.
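Pruning bad data can be sketched as follows, under the assumption that each landmark records how often it was re-observed consistently versus inconsistently; the `Landmark` structure and the threshold are illustrative, not NavStack's internal representation.

```python
from dataclasses import dataclass


@dataclass
class Landmark:
    position: tuple
    consistent_hits: int = 0    # re-observations that matched the map
    inconsistent_hits: int = 0  # re-observations that contradicted it


def prune_bad_data(landmarks, min_ratio=0.5):
    """Drop landmarks whose observations were mostly inconsistent,
    i.e. likely inserted while the position estimate was bad."""
    kept = []
    for lm in landmarks:
        total = lm.consistent_hits + lm.inconsistent_hits
        if total == 0 or lm.consistent_hits / total >= min_ratio:
            kept.append(lm)
    return kept
```

Landmarks with no re-observations yet are kept, so freshly mapped areas are not discarded before they can be confirmed.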