
Cranfield University Uses Panther for Aircraft Inspection in Hangars

· 9 min read
Alicja Sadowska
Marketing Manager @ Husarion

Aircraft inspection is a complex task that demands high precision. These inspections often involve detecting subtle defects in aircraft components, such as cracks or dents.

The use of robots for aircraft inspection is steadily increasing, driven by their cost-effectiveness and their ability to reduce the errors associated with manual inspections. However, broader adoption of robotic systems has faced challenges, particularly the difficulty of accurately positioning robots within complex aircraft hangars, where GPS signals are unavailable. Yet accurate localisation is exactly what meticulous, precise inspections depend on.

Two student research projects at the Digital Aviation Research and Technology Centre (DARTeC) at Cranfield University take on this problem, each enhancing the precision of robotic localisation with a different method, and both built on the Husarion Panther UGV platform.


Project #1 - Visual tracking integration for Panther’s platform navigation inside the smart hangar

The hangar environment complicates robot navigation due to its large, featureless spaces, inconsistent lighting, and reflective surfaces, which can interfere with sensor readings. Traditional localisation methods like GPS are ineffective due to signal interference, while IMUs and odometry struggle with drift and inaccuracies.

Figure: Cranfield University’s Smart Hangar

Sakshi Chavan from Cranfield University (MSc in Robotics) came up with the idea of using multiple external cameras to capture images of Panther inside the hangar, which are then processed using YOLOv8, a deep-learning model for real-time object detection. These detections generate pixel coordinates, which are transformed into real-world positions.
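
To make the pipeline concrete, here is a minimal sketch of the detection-to-position step, assuming a fine-tuned YOLOv8 model and a precomputed image-to-floor homography for a fixed camera. The file names and the `locate_robot` helper are illustrative placeholders, not taken from the project:

```python
import numpy as np
from ultralytics import YOLO

# Minimal sketch: detect Panther with YOLOv8 and map the detection's
# ground-contact pixel to hangar-floor coordinates via a homography.
# The weights and homography files below are hypothetical placeholders.
model = YOLO("panther_yolov8.pt")            # assumed fine-tuned weights

# H maps homogeneous pixel coordinates to floor coordinates (metres);
# computed offline, e.g. with cv2.findHomography() on surveyed markers.
H = np.load("camera0_floor_homography.npy")

def locate_robot(frame):
    """Return the (x, y) floor position of the best detection, or None."""
    result = model(frame, verbose=False)[0]
    if len(result.boxes) == 0:
        return None
    # Highest-confidence box; its bottom-centre approximates the point
    # where the robot touches the floor.
    best = result.boxes[int(result.boxes.conf.argmax())]
    x1, y1, x2, y2 = best.xyxy[0].tolist()
    pixel = np.array([(x1 + x2) / 2.0, y2, 1.0])
    world = H @ pixel
    return world[0] / world[2], world[1] / world[2]
```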

To enhance the localisation accuracy, the data from external cameras, IMUs, and odometry sensors are merged through sensor fusion techniques using the Extended Kalman Filter (EKF).
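
The post does not include the filter itself, but the idea behind EKF-based fusion can be shown in a few lines of numpy: predict the robot's planar state with a constant-velocity model, then correct it with an absolute position fix from the camera pipeline. This is a generic illustration with assumed noise values, not the project's filter; in a ROS 2 stack this role is typically played by the robot_localization package:

```python
import numpy as np

# Minimal planar EKF sketch: constant-velocity motion model, corrected by
# absolute (x, y) fixes from the external-camera pipeline.
class PlanarEKF:
    def __init__(self):
        self.x = np.zeros(4)            # state: [x, y, vx, vy]
        self.P = np.eye(4)              # state covariance
        self.Q = np.diag([0.01, 0.01, 0.1, 0.1])   # process noise (assumed)
        self.R = np.diag([0.05, 0.05])  # camera-fix noise, m^2 (assumed)

    def predict(self, dt):
        F = np.eye(4)
        F[0, 2] = F[1, 3] = dt          # x += vx*dt, y += vy*dt
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.Q

    def update_camera_fix(self, z):
        # A position measurement is linear, so the EKF update reduces to
        # the standard Kalman update with a constant H matrix.
        Hm = np.array([[1., 0., 0., 0.], [0., 1., 0., 0.]])
        y = np.asarray(z) - Hm @ self.x            # innovation
        S = Hm @ self.P @ Hm.T + self.R            # innovation covariance
        K = self.P @ Hm.T @ np.linalg.inv(S)       # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ Hm) @ self.P
```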

This framework is built in ROS 2, developed and tested in a simulation environment using Ignition Gazebo before real-world deployment.
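
ROS 2 launch files are themselves Python, so wiring such a filter into the stack could look roughly like the hypothetical launch file below; the parameter file name is an assumption:

```python
# Hypothetical ROS 2 launch file: start robot_localization's EKF node with
# a parameter file listing the fused inputs (odometry, IMU, camera fixes).
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package="robot_localization",
            executable="ekf_node",
            name="ekf_filter_node",
            parameters=["config/hangar_ekf.yaml"],  # assumed params file
        ),
    ])
```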

Figure: YOLO robot detection

She chose a Panther platform for this project for several reasons:

“The Panther robot is an ideal choice for this project due to its size, four-wheel drive, and robust design, allowing it to cover large areas in complex hangar environments. Its ROS2 compatibility simplifies the integration of advanced systems like detection, pose estimation, and sensor fusion. Additionally, the Husarion team provided existing simulation packages, which made it easy to set up the Gazebo simulation environment quickly, allowing the project to start efficiently and focus on testing and development.”

-- Sakshi Chavan, MSc in Robotics, Cranfield University

The project's biggest challenges

However, combining image recognition from multiple cameras brings its own challenges:

  • Depth estimation with standard RGB cameras relies on pixel positioning and camera parameters. However, angled camera setups (e.g., at 30° or 60°) can distort depth perception, affecting Z-axis accuracy and pose estimation.
  • In complex environments like the aircraft hangar, a single camera may lose sight of the robot due to structural obstacles, creating blind spots that disrupt tracking. To maintain consistent localisation, the system requires multiple cameras that can switch views as needed to follow the robot’s movements (a simple version of this hand-off is sketched after this list).
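
One simple way to realise that hand-off, shown here purely as an illustration, is to let every camera report its latest detection and keep only the freshest, most confident fix:

```python
import time

# Illustrative hand-off logic (not from the project): each camera reports
# its latest detection as (confidence, (x, y) floor position, timestamp);
# keeping only fresh, confident fixes lets another camera take over when
# the robot enters one camera's blind spot.
def select_fix(candidates, max_age=0.5, min_conf=0.5):
    """candidates: list of (confidence, position, stamp), one per camera."""
    now = time.monotonic()
    fresh = [c for c in candidates
             if now - c[2] <= max_age and c[0] >= min_conf]
    if not fresh:
        return None                  # robot currently in a blind spot
    conf, position, _ = max(fresh, key=lambda c: c[0])
    return position
```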

Figure: Pose estimation

Development of the project

The project has significantly progressed through extensive testing in simple and complex environments. Initially, the work focused on developing a YOLOv8-based detection system and refining sensor fusion for improved localisation in simulations. As testing expanded to the more intricate DARTeC hangar, new challenges like depth estimation, variable lighting, and obstacle avoidance emerged.

Despite performance improvements with sensor fusion, further algorithm and depth-sensing upgrades are needed to handle complex environments. Currently, the system operates smoothly in the Gazebo simulation and is being refined for depth perception and localisation accuracy to support real-world testing.

Figure: Actual coordinates vs. estimated pose

Next Steps

  1. Increasing diversity in Panther’s dataset to improve pose estimation by including images captured under varying conditions, such as different lighting and backgrounds.
  2. Integration of stereo or depth cameras to improve depth perception, particularly for angled views.
  3. Deploying the system in actual hangar environments to assess practical performance and tackle unexpected challenges.

So what results did Sakshi’s university colleague obtain using a slightly different approach to the hangar localisation problem?

Project #2 - Ultra-Wideband Integration

Accurate localisation is also essential for automated Maintenance, Repair, and Overhaul (MRO) tasks, where robots perform precise inspections. Rejin Jacob Vadukkoot (MSc in Robotics) conducted a research project to enhance navigation precision by integrating Ultra-Wideband (UWB) technology and advanced sensor fusion.

Figure: Panther UWB navigation

Panther serves as the primary platform for testing the sensor integration and navigation algorithms. A UWB tag is attached to the mobile robot and integrated with traditional onboard sensors such as LIDAR, IMUs, and wheel odometry, all set within a simulated environment that closely replicates a scaled model of the Digital Aviation Research and Technology Centre hangar. Using Extended Kalman Filters (EKF) for sensor fusion, this setup enables centimeter-level accuracy and allows testing of complex navigation scenarios.
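
The article doesn't detail how positions are computed from the UWB ranges, but a standard approach is multilateration: with the anchor positions known, the nonlinear range equations can be linearised and solved by least squares. The anchor layout below is made up for the example:

```python
import numpy as np

# Minimal sketch of UWB multilateration: given known anchor positions and
# measured ranges, linearise the range equations by subtracting the first
# anchor's equation and solve the resulting system by least squares.
def multilaterate(anchors, ranges):
    """anchors: (N, 2) anchor positions; ranges: (N,) measured distances."""
    anchors = np.asarray(anchors, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    # ||p - a_i||^2 - ||p - a_0||^2 = r_i^2 - r_0^2 is linear in p:
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (ranges[0] ** 2 - ranges[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Example: four anchors in a 20 m x 20 m bay, tag at roughly (5, 8).
anchors = [(0, 0), (20, 0), (0, 20), (20, 20)]
tag = np.array([5.0, 8.0])
ranges = [np.linalg.norm(tag - np.array(a)) for a in anchors]
print(multilaterate(anchors, ranges))   # ~[5. 8.]
```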

“Panther was chosen for our project for its versatility, robust sensor integration capabilities, and exceptional technical support in simulation and hardware contexts. Its strong support within the ROS community and the availability of constantly maintained and updated Gazebo simulation-based packages made it an ideal platform for our needs.”

-- Rejin Jacob Vadukkoot, MSc in Robotics, Cranfield University


Why does UWB make a difference?

Ultra-wideband is a radio technology that uses very short pulses of low-power radio waves across a wide frequency range. Operating over such a wide bandwidth with minimal energy consumption gives it strong immunity to multipath interference, letting receivers distinguish direct from reflected signals. UWB's precise ranging works by measuring the time it takes for pulses to travel between devices, leveraging time-of-flight (ToF) calculations. This enables accurate distance measurements, often within a few centimeters, making it ideal for applications like indoor navigation and asset tracking. It has been used in industrial automation for asset tracking, in healthcare for patient monitoring, and even in consumer electronics. More recently, UWB has become increasingly valuable in mobile robotics because it provides centimeter-level accuracy and supports high data rates.
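
As a quick sanity check of why timing matters so much: radio pulses travel at the speed of light, so a single nanosecond of timing error already corresponds to roughly 30 cm of range error. A toy two-way-ranging calculation (illustrative numbers):

```python
# Back-of-the-envelope ToF ranging. Two-way ranging cancels the clock
# offset between the two devices (illustrative, simplified model).
C = 299_792_458.0                 # speed of light, m/s

def twr_distance(t_round, t_reply):
    """t_round: round trip measured at the initiator; t_reply: responder turnaround."""
    return C * (t_round - t_reply) / 2.0

# e.g. 68 ns round trip, 1 ns responder turnaround -> about 10 m
print(twr_distance(68e-9, 1e-9))  # ~10.04
```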

Challenges and Solutions

The UWB simulation was created from scratch within the ROS 2 and Gazebo framework.
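
The simulation's source isn't shown in the post, but its core idea (turning the robot's ground-truth pose into noisy ranges to fixed anchors) fits in a short rclpy node. Topic names, anchor layout, and the Gaussian noise model below are all assumptions:

```python
import numpy as np
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped
from std_msgs.msg import Float64MultiArray

# Hypothetical sketch of a from-scratch UWB simulator: subscribe to the
# robot's ground-truth pose (e.g. bridged from Gazebo) and publish noisy
# ranges to fixed anchors.
ANCHORS = np.array([[0.0, 0.0], [20.0, 0.0], [0.0, 20.0], [20.0, 20.0]])

class UwbSim(Node):
    def __init__(self):
        super().__init__("uwb_sim")
        self.pub = self.create_publisher(Float64MultiArray, "uwb/ranges", 10)
        self.create_subscription(PoseStamped, "ground_truth_pose", self.cb, 10)

    def cb(self, msg):
        p = np.array([msg.pose.position.x, msg.pose.position.y])
        ranges = np.linalg.norm(ANCHORS - p, axis=1)
        ranges += np.random.normal(0.0, 0.05, size=ranges.shape)  # 5 cm noise
        self.pub.publish(Float64MultiArray(data=ranges.tolist()))

def main():
    rclpy.init()
    rclpy.spin(UwbSim())

if __name__ == "__main__":
    main()
```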

  • A key challenge arises from managing the complexities of UWB signal processing in cluttered indoor environments, where interference is common. Ongoing work involves accurately modeling the covariances associated with UWB data to better reflect these challenging conditions.
  • It was also important to refine the sensor fusion algorithms, using Extended Kalman Filters (EKF) to integrate the UWB data with inputs from other sensors like LIDAR and IMUs; a sketch of such a range update follows this list.
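
For illustration, this is what an EKF update with a single UWB range to a known anchor could look like. Because the range measurement h(x) = ||pos - anchor|| is nonlinear, the filter linearises it with a Jacobian (a generic sketch, not the project's code):

```python
import numpy as np

# Illustrative EKF range update: a UWB range to a known anchor is a
# nonlinear measurement, so the EKF linearises it at the current state.
def ekf_range_update(x, P, anchor, r_meas, r_var):
    """x: state [x, y, ...]; P: covariance; r_meas/r_var: range and its variance."""
    d = x[:2] - anchor
    r_pred = np.linalg.norm(d)
    H = np.zeros((1, x.size))
    H[0, :2] = d / r_pred                 # Jacobian of h at current state
    S = H @ P @ H.T + r_var               # innovation covariance (1x1)
    K = (P @ H.T) / S                     # Kalman gain
    x_new = x + (K * (r_meas - r_pred)).ravel()
    P_new = (np.eye(x.size) - K @ H) @ P
    return x_new, P_new
```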

Currently, the project has successfully demonstrated significant improvements in localisation accuracy within these simulations. Initial simulation tests have shown that integrating UWB with localisation information from the SLAM Toolbox—which utilizes a 2D SLAM (Simultaneous Localization and Mapping) based approach—results in a 30% improvement in localisation accuracy compared to using the SLAM Toolbox alone.

“Panther’s software support was incredibly beneficial. The platform provides extensive examples of integrating multiple sensors and hardware, which proved invaluable in integrating the UWB technology. The documentation quality also stood out, offering clear guidance that enhanced our team's ability to effectively implement complex sensor fusion algorithms. These capabilities ensured rapid prototyping, flexible experimentation with different sensors and algorithms, and consistent performance during our tests.”

-- Rejin Jacob Vadukkoot, MSc in Robotics, Cranfield University

Figure: UWB tag attached to the Panther robot

Next Steps

The project will progress from simulation-based testing to real-world trials in actual hangar environments, aimed at validating the sensor fusion algorithms and measuring overall system performance in dynamic, real-world conditions.

To further enhance robustness, the sensor suite is being expanded to fully utilize 3D LIDAR capabilities (previously limited to 2D) and now includes depth cameras. This upgrade calls for new algorithms to process the 3D data, enabling significantly more precise navigation in complex environments.

Additionally, the project produced an open-source UWB simulation for ROS 2 and Gazebo, providing a foundation for further research and innovation in robotic navigation and sensor integration.

Whether through image recognition or Ultra-Wideband integration, ongoing tests and improvements are showing promising results for the practical use of both methods.

Summary: Panther Platform Tackling Hangar Navigation Challenges

The Panther platform excels in addressing the unique challenges of aircraft hangar navigation, where GPS signals are absent and environments are large, reflective, and cluttered. At Cranfield University, two research projects demonstrated Panther’s capabilities. One uses visual tracking with deep learning to navigate featureless spaces, while the other integrates Ultra-Wideband (UWB) for centimeter-level accuracy despite interference and obstacles. Panther’s robust design, ROS 2 compatibility, and simulation tools make it an ideal solution for precise, reliable navigation in complex hangar environments.

The two research projects were the students’ final theses, supervised by Amaka Adiuku, Angelos Plastropoulos, and Gilbert Tang.

Figure: Panther for aircraft inspection

“The Panther robot is the ideal robotic platform for our learning needs. It is a robust and well-designed platform that enables students to gain experience working with an actual robot. With its high payload and numerous supported interfaces, Panther is perfect for experimenting with sensors in different scenarios and performing various tasks in our test facilities. The accompanying libraries and packages help students quickly grasp the requirements and avoid technical issues and incompatibilities. The supporting software is well-written and serves as an excellent example of how high-quality and tested packages should be developed. Husarion’s support is always available to assist us when we need that extra push to achieve the desired outcome. Working with Panther brings true joy to every roboticist, and this is the experience we aim to share with our students!”

-- Angelos Plastropoulos, Lab Manager of the Automated Diagnostics & Machine Vision Lab, Cranfield University


If you believe that Panther could be equally useful in your use case, don’t hesitate to visit our store or contact us at contact@husarion.com. 🤖