Imaging laser sensors - Final report for the period 2014-2016

Authors:

  • Per Jonsson
  • Markus Henriksson
  • Gustav Tolt
  • Christina Grönwall
  • Michael Tulldahl
  • Fredrik Bissmarck
  • Patrik Lif
  • Lars Sjökvist
  • Håkan Larsson

Publish date: 2017-03-16

Report number: FOI-R--4346--SE

Pages: 32

Written in: Swedish

Keywords:

  • 3D imaging
  • laser radar
  • UAVs
  • mapping
  • reconnaissance
  • target recognition

Abstract

This report summarises the results of the three-year project Imaging laser sensors, which was part of the Swedish Armed Forces' commissioning of research at FOI. The project ran from 2014 to 2016 and examined the design and performance of future laser-based 3D imaging systems for various military tasks and scenarios. Short summaries of the activities and research results are presented for several scenarios: long-range target recognition, LIDAR systems for monitoring, local situational awareness and documentation, and tactical mapping and planning. The report also describes FOI's interaction with other organisations in the field and the dissemination of knowledge to the Swedish Armed Forces, FMV and other stakeholders, and it ends with a brief outlook on the future of imaging laser sensors and a summary.

Among the results, the project demonstrated the ability of a small multirotor UAV to gather 3D data with a laser scanner. Terrestrial and airborne measurements with short-range laser scanners indicate possible applications in areas as diverse as urban warfare and military bridge building. Furthermore, an experimental laser radar system based on photon counting showed very good resolution and penetration into the forest edge. This technology is not only suitable for long-range target recognition; the project has also identified the possibility of using laser radars for reconnaissance and surveillance.

In signal processing, new methods have been developed to improve the performance of 3D laser sensors on moving platforms by utilising data from both inertial sensors and previous 3D measurements in the series. New methods for sensor planning (Next Best View) have also been investigated and developed; they determine where to measure next in order to best cover the areas not yet observed, which enables more time-efficient measurement and provides knowledge of which volume the previous measurements have covered. The project has also conducted initial user tests to examine how 3D data should be presented to a user and what demands are placed on the 3D data for the user to be able to distinguish objects. To distinguish vehicles of dissimilar types, a requirement of about 100 points on the target was identified, while about 1 000 measurement points are needed to distinguish more similar objects, such as different car models. Some presentation methods that enhance the ability to follow a sequence of events have been identified, such as indicating the direction of movement in the images.

The technical development of imaging laser sensors means that the sensors are becoming smaller and can collect data at higher rates and over longer ranges. Combined with the development of both airborne and terrestrial autonomous vehicles, this will in the future bring new opportunities to collect 3D data quickly, safely and at an affordable cost. This capability can bring major benefits through a better understanding of, for example, what is around the next corner or hill, where it is possible to advance, or what the enemy is staging in preparation for an attack.
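
The abstract mentions Next Best View sensor planning only at this summary level, without detailing the project's method. As a rough, hedged illustration of the general idea only, the sketch below greedily selects the candidate viewpoint expected to reveal the most voxels that have not yet been measured. The candidate views, visibility sets and voxel identifiers are hypothetical placeholders; a real planner would also handle ray casting, occlusion and platform constraints.

```python
# Minimal greedy Next Best View sketch (illustrative only, not the FOI method).
# Assumes each candidate viewpoint has a precomputed set of voxel ids it would
# observe; the planner simply picks the view with the largest expected gain.

def next_best_view(candidate_views, covered_voxels):
    """Pick the view expected to reveal the most voxels not yet covered.

    candidate_views: dict mapping view id -> set of voxel ids visible from it
    covered_voxels:  set of voxel ids already measured
    """
    best_view, best_gain = None, 0
    for view_id, visible in candidate_views.items():
        gain = len(visible - covered_voxels)  # new voxels this view would add
        if gain > best_gain:
            best_view, best_gain = view_id, gain
    return best_view, best_gain


if __name__ == "__main__":
    views = {
        "north": {1, 2, 3, 4},
        "east":  {3, 4, 5, 6, 7},
        "south": {6, 7, 8},
    }
    covered = {1, 2, 3, 4}                 # voxels seen by the first measurement
    print(next_best_view(views, covered))  # -> ('east', 3)
```

The greedy choice shown here is only one possible strategy; it trades optimality for simplicity, which matches the abstract's stated goal of more time-efficient measurement rather than exhaustive coverage planning.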
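
The stated point requirements (roughly 100 points on target to separate dissimilar vehicle types, roughly 1 000 to separate similar objects such as car models) can be put in context with a back-of-the-envelope estimate. The point densities and projected target area below are assumed example values, not figures from the report.

```python
# Rough points-on-target estimate (assumed figures, not from the report).
# points_on_target ~ point density on the target surface * projected target area

def points_on_target(density_pts_per_m2, projected_area_m2):
    return density_pts_per_m2 * projected_area_m2

# Hypothetical example: a side-viewed vehicle with ~10 m^2 projected area.
area = 10.0
for density in (5, 20, 150):  # points per m^2 (assumed example densities)
    n = points_on_target(density, area)
    if n >= 1000:
        verdict = "may separate similar models (>= ~1000 pts, per the report)"
    elif n >= 100:
        verdict = "may separate dissimilar types (>= ~100 pts, per the report)"
    else:
        verdict = "likely too sparse for recognition"
    print(f"{density:>4} pts/m^2 -> {n:>6.0f} pts on target: {verdict}")
```

Run as-is, the three assumed densities land below, between and above the two thresholds quoted in the abstract, which is the only point the sketch is meant to illustrate.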