
Perception-SLAM-Stack

A ROS2-based lightweight LiDAR mapping tool integrating the Velodyne VLP-16 LiDAR and Wheeltec N100 IMU for GPS-denied 3D mapping and localization. Designed for challenging environments like cave surveying, this project demonstrates practical hardware integration, sensor fusion, and implementation of state-of-the-art open-source SLAM algorithms.


Overview

This repository documents my journey building a professional-grade perception stack from the ground up, focusing on real-world sensor integration and SLAM algorithm implementation. Through iterative hardware testing and field deployments, I've developed practical expertise in robotic perception systems while exploring applications from vehicle-mounted mapping to aerial surveying.

Key Learning Areas:

  • Multi-sensor hardware integration and calibration
  • ROS2 ecosystem and sensor drivers
  • SLAM algorithm evaluation and tuning
  • Field data collection and post-processing pipelines
  • System packaging for mobile platforms

Demo

Handheld SLAM platform built around a SmallRig cage, which provides easy attachment points for the Pi, HMI, batteries, and VLP-16.


Vehicle-mounted FAST-LIO mapping down a residential street:

SLAM.mp4

Hardware Stack

Current Configuration

  • LiDAR: Velodyne VLP-16 (used, sourced via eBay)
  • IMU: Wheeltec N100 9-axis IMU
  • Compute: Raspberry Pi 5 (8GB) running ROS2 Humble

IMU Selection Journey

Through testing multiple IMU units, I learned firsthand why sensor quality is critical for SLAM performance:

| IMU Model | Result | Key Learnings |
| --- | --- | --- |
| Bosch BNO085 | Poor SLAM convergence | Low-cost MEMS sensors suffer from significant drift and noise in dynamic environments |
| WitMotion WT901B | Unstable pose estimation | Calibration software was extremely buggy; I could never get the unit properly calibrated |
| Wheeltec N100 | Successful mapping | Worked well, though a higher-grade IMU would improve performance |

Why IMU Quality Matters for SLAM:

  • Motion Prediction: High-rate IMU data (>100 Hz) provides inter-frame motion estimates, crucial for scan-matching initialization
  • Drift Compensation: Superior bias stability reduces integration drift in feature-sparse environments
  • Tightly-Coupled Fusion: Algorithms like FAST-LIO rely on accurate IMU pre-integration; noise directly degrades optimization convergence (see the drift sketch below)
  • Dynamic Motion Handling: Better gyroscope performance maintains pose accuracy during aggressive vehicle maneuvers or walking motion
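
To make the drift mechanism concrete, here is a minimal dead-reckoning sketch. It is plain Euler integration over made-up sample data, not the on-manifold pre-integration FAST-LIO actually performs; it only illustrates how a small gyro bias corrupts gravity cancellation and compounds into position error.

```python
import numpy as np

def dead_reckon(samples, dt):
    """Naively integrate (gyro, accel) body-frame samples into a pose guess."""
    R, v, p = np.eye(3), np.zeros(3), np.zeros(3)
    g = np.array([0.0, 0.0, -9.81])
    for gyro, accel in samples:
        wx, wy, wz = gyro
        # First-order orientation update from the gyro rate (skew-symmetric form)
        Omega = np.array([[0, -wz, wy], [wz, 0, -wx], [-wy, wx, 0]])
        R = R @ (np.eye(3) + Omega * dt)
        # Rotate specific force into the world frame, cancel gravity, integrate twice
        v = v + (R @ accel + g) * dt
        p = p + v * dt
    return R, p

# One second of a *stationary* IMU at 200 Hz with a 0.01 rad/s gyro bias on x:
# the orientation error tilts the gravity term, so position drifts even though
# the sensor never moved, and the error grows roughly quadratically with time.
n, dt = 200, 1.0 / 200
biased = [(np.array([0.01, 0.0, 0.0]), np.array([0.0, 0.0, 9.81]))] * n
_, p = dead_reckon(biased, dt)
print("position drift after 1 s:", p)
```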

Deployment Platforms

Vehicle-Mounted Configuration

Truck-mounted perception stack

Advantages:

  • Stable platform for algorithm validation
  • Higher speeds test scan matching robustness
  • Easy to collect large-scale mapping datasets

Lessons Learned:

  • Vibration isolation critical for IMU mounting
  • Vehicle velocity aids loop closure detection
  • Suction mount provides surprising rigidity at speeds under 30 mph

Initial Handheld Configuration Design

Handheld Stack

Field Setup:

Backpack power system

Challenges Encountered:

  • Walking-induced IMU oscillations cause scan distortion
  • Handheld variability exposes IMU quality limitations
  • Power management more complex than anticipated

Future Development: Aerial Mapping to Cave Interior Surveying

The initial development roadmap follows a phased approach: first establishing aerial mapping capabilities with the VLP-16 stack, then transitioning to cave interior surveying with a more compact sensor package.

Phase 1: Heavy-Lift Platform

  • X650 hexacopter carrying VLP-16 + N100 IMU + Pi5
  • General aerial surveying
  • Proof-of-concept for aerial LiDAR workflows
  • Data fusion: LiDAR point clouds + RGB photogrammetry (RealityScan) for textured, high-fidelity 3D models

Phase 2: Compact Cave Interior Platform (Future)

To enable drone-based mapping inside caves, I plan to downsize to a Livox AVIA LiDAR, which features:

  • Integrated IMU (eliminating separate N100)
  • Significantly reduced weight and form factor
  • Non-repetitive scanning pattern (ideal for SLAM)
  • Mountable on 5-7" FPV quadcopter frames

Hybrid Cave Surveying Strategy: The compact FPV platform would have limited flight time (~5-10 minutes), making it ideal for mapping large cave chambers and rooms, while handheld recording handles smaller passages and crawlways where drone operation isn't feasible. This two-pronged approach maximizes coverage while working within the constraints of each platform.

Aerial Stack Concept

X650 Payload Stack

Target Platform: X650 hexacopter
Payload: VLP-16 + Pi5 + battery

Planned Capabilities

  • Offline Recording: Raspberry Pi 5 logs /velodyne_points and /imu topics to SSD
  • Post-Processing Pipeline: FAST-LIO batch processing on a workstation for high-fidelity maps (bag replay sketched after this list)
  • Multi-Modal Fusion: RGB camera integration for textured point clouds
  • Photogrammetry Complement: Already experimenting with RealityScan for hybrid surveying workflows
  • Cave Survey Workflow: Aerial entrance and large-room mapping → handheld interior mapping → registration and fusion of datasets
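
As a sketch of that post-processing step, the rosbag2_py API (ROS2 Humble) can iterate a recorded bag offline. The bag path below is a placeholder; the topic names match the recording workflow in this README.

```python
# Minimal offline bag reader (ROS2 Humble, rosbag2_py). The bag URI is a
# placeholder; /velodyne_points and /imu are the topics recorded above.
import rosbag2_py
from rclpy.serialization import deserialize_message
from sensor_msgs.msg import Imu, PointCloud2

reader = rosbag2_py.SequentialReader()
reader.open(
    rosbag2_py.StorageOptions(uri='/mnt/ssd/run1', storage_id='sqlite3'),
    rosbag2_py.ConverterOptions('cdr', 'cdr'),
)

msg_types = {'/velodyne_points': PointCloud2, '/imu': Imu}
while reader.has_next():
    topic, raw, stamp_ns = reader.read_next()
    if topic in msg_types:
        msg = deserialize_message(raw, msg_types[topic])
        # Hand msg to the offline SLAM run or analysis code here.
```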

Rationale: Aerial platforms enable comprehensive cave entrance, surrounding terrain, and large room/passage mapping, while handheld systems handle interior passages. While commercial solutions like Inertial Labs RESEPI exist, building from scratch maximizes learning in sensor fusion, embedded systems, and autonomous flight integration—skills directly applicable to challenging mapping environments.


SLAM Algorithms Evaluated

FAST-LIO

  • Best overall performance with the N100 IMU
  • Tightly-coupled iterated Kalman filter approach
  • Excellent in feature-sparse environments

KISS-ICP

  • Lightweight, geometry-only approach (a generic scan-matching sketch follows this section)
  • Good for initial testing without IMU dependency
  • Limited performance in long corridors

Cartographer

  • Google's proven solution for indoor mapping
  • Heavier computational requirements
  • Strong loop closure, but requires tuning for outdoor use
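
Stripped to its core, KISS-ICP's geometry-only idea is iterative closest point registration between consecutive scans. The sketch below uses Open3D's stock point-to-point ICP as a stand-in for that idea, not KISS-ICP's actual adaptive-threshold implementation; the PCD file names are placeholders for two consecutive exported scans.

```python
# Generic geometry-only scan matching with Open3D ICP -- a stand-in for the
# idea behind KISS-ICP, not its implementation. File names are placeholders.
import numpy as np
import open3d as o3d

source = o3d.io.read_point_cloud("scan_0001.pcd")
target = o3d.io.read_point_cloud("scan_0000.pcd")

# Voxel downsampling for speed and robustness, as most LiDAR odometry pipelines do
source_d = source.voxel_down_sample(voxel_size=0.5)
target_d = target.voxel_down_sample(voxel_size=0.5)

result = o3d.pipelines.registration.registration_icp(
    source_d, target_d,
    max_correspondence_distance=1.0,
    init=np.eye(4),  # with an IMU, this initial guess comes from pre-integration
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)
print(result.transformation)  # 4x4 relative pose between the two scans
```

The init argument is exactly where the IMU discussion above pays off: a good motion prior shrinks the correspondence search and keeps ICP from diverging in feature-sparse scans.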

Data Pipeline

[Data pipeline diagram (draw.io)]

Current Workflow:

  1. Mount stack on platform (vehicle/drone)
  2. Launch ROS2 drivers and record sensor topics (launch-file sketch below)
  3. Transfer bag files to workstation
  4. Run SLAM algorithm offline for optimized maps
  5. Export PCD/PLY for analysis and visualization
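
A minimal launch-file sketch for step 2, assuming the standard velodyne_driver and velodyne_pointcloud packages on ROS2 Humble; the IMU driver package/executable names and the bag output path are placeholders for the Wheeltec N100 vendor driver and the local SSD.

```python
# record_run.launch.py -- sketch of driver bring-up plus bag recording.
from launch import LaunchDescription
from launch.actions import ExecuteProcess
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        # VLP-16 packet driver and packet-to-PointCloud2 conversion
        # (velodyne_transform_node also expects a 'calibration' yaml
        # parameter in practice; omitted here for brevity)
        Node(package='velodyne_driver', executable='velodyne_driver_node',
             parameters=[{'model': 'VLP16'}]),
        Node(package='velodyne_pointcloud', executable='velodyne_transform_node',
             parameters=[{'model': 'VLP16'}]),
        # Wheeltec N100 IMU driver (placeholder package/executable names)
        Node(package='n100_imu_driver', executable='imu_node'),
        # Record exactly the topics the offline SLAM run needs, onto the SSD
        ExecuteProcess(cmd=['ros2', 'bag', 'record',
                            '/velodyne_points', '/imu',
                            '-o', '/mnt/ssd/run1']),
    ])
```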

Project Goals

This project serves multiple objectives:

  1. Cave Mapping Application: Create a practical tool for 3D surveying of caves and underground environments where GPS is unavailable
  2. Lightweight & Portable: Design a system that can be deployed handheld, on drones, or mounted on ground vehicles for versatility
  3. Skill Development: Hands-on experience with perception systems, sensor fusion, and robotics middleware
  4. Algorithm Understanding: Practical evaluation of SLAM approaches in GPS-denied, feature-limited environments
  5. System Integration: Real-world hardware challenges (power, mounting, calibration, environmental factors)
  6. Portfolio Building: Demonstrate technical capabilities in robotic perception systems
  7. Open Learning: Document successes and failures for community benefit

Future Roadmap

Near-Term

  • Complete heavy-lift drone build for aerial surveying
  • Develop ruggedized handheld mapping configuration
  • Test SLAM algorithms in GPS-denied, low-feature environments
  • Integrate RGB camera for textured mapping in low-light conditions
  • Develop automated data processing pipeline
  • Create survey stitch workflow: aerial mapping → handheld mapping → combined dataset

Long-Term

  • Develop lightweight drone with Livox AVIA for surveying
  • Deploy system in actual cave environments for field validation
  • Integrate real-time SLAM for autonomous drone navigation and GPS-denied waypoint missions
  • Investigate visual-inertial odometry for redundancy in feature-sparse areas
  • Compare LiDAR vs. photogrammetry reconstruction quality in caves
  • Explore integration with flight controllers for autonomous obstacle avoidance
  • Develop waterproofing solutions for wet environments

Lessons Learned

  1. Hardware matters more than expected: Algorithm performance ceiling is set by sensor quality
  2. Iterative development is key: Cheaper components taught valuable lessons before investing in quality hardware
  3. Real-world != simulation: Vibration, power management, and environmental factors dominate practical deployments
  4. Data collection is hard: Robust recording systems and backup power are non-negotiable

Note: This project represents ongoing learning and development in robotic perception systems. Feedback and collaboration opportunities are always welcome.
