Autonomous micro-rovers offer a cost-effective approach to exploring lunar lava tubes and play an essential role in on-site operations. However, lunar rovers face significant challenges, including prolonged darkness and constrained resources. These constraints include limited processing power, memory, and energy, all of which affect the accuracy of localization and mapping.
This Master's Thesis focuses on implementing Visual Simultaneous Localization and Mapping (vSLAM) and Visual Odometry (VO) algorithms in resource-constrained lunar environments. To address these challenges, the approach integrates FPGA-SoC embedded processors with stereo cameras. The distinct lighting conditions on the Moon further motivate an evaluation of vSLAM/VO performance across a range of lighting scenarios.