Airborne optical sectioning

Indrajit Kurmi, David C. Schedl, Oliver Bimber

Research output: Contribution to journal › Article › peer-review

24 Citations (Scopus)

Abstract

Drones are becoming increasingly popular for remote sensing of landscapes in archeology, cultural heritage, forestry, and other disciplines. They are more efficient than airplanes for capturing small areas of up to several hundred square meters. LiDAR (light detection and ranging) and photogrammetry have been applied together with drones to achieve 3D reconstruction. With airborne optical sectioning (AOS), we present a radically different approach that is based on an old idea: synthetic aperture imaging. Rather than measuring, computing, and rendering 3D point clouds or triangulated 3D meshes, we apply image-based rendering for 3D visualization. In contrast to photogrammetry, AOS does not suffer from inaccurate correspondence matches and long processing times. It is cheaper than LiDAR, delivers surface color information, and has the potential to achieve high sampling resolutions. AOS samples the optical signal of wide synthetic apertures (30-100 m diameter) with unstructured video images recorded from a low-cost camera drone to support optical sectioning by image integration. The wide aperture signal results in a shallow depth of field and consequently in a strong blur of out-of-focus occluders, while images of points in focus remain clearly visible. Shifting focus computationally towards the ground allows optical slicing through dense occluder structures (such as leaves, tree branches, and coniferous trees), and discovery and inspection of concealed artifacts on the surface.
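The core idea of synthetic aperture refocusing described in the abstract, aligning the recorded images to a chosen focal plane and integrating them, can be illustrated with a minimal shift-and-average sketch. This is not the authors' implementation; it assumes an idealized fronto-parallel camera path with known lateral offsets, and the function name `refocus` and all parameters are illustrative:

```python
import numpy as np

def refocus(images, offsets_m, depth_m, focal_px):
    """Synthetic-aperture refocusing by shift-and-average (illustrative sketch).

    images    : list of HxW grayscale arrays taken along a lateral camera path
    offsets_m : lateral camera offsets (meters) from the reference view
    depth_m   : distance (meters) of the desired focal plane, e.g. the ground
    focal_px  : focal length expressed in pixels

    Scene points at depth_m align across views and integrate sharply;
    points at other depths (occluders) are smeared by the wide aperture.
    """
    acc = np.zeros_like(images[0], dtype=float)
    for img, dx in zip(images, offsets_m):
        # Disparity of the focal plane in this view: proportional to
        # baseline / depth (simple pinhole model, horizontal motion only).
        shift_px = int(round(focal_px * dx / depth_m))
        acc += np.roll(img, -shift_px, axis=1)  # align view to the focal plane
    return acc / len(images)
```

Shifting `depth_m` re-slices the integrated image at a different plane, which is the computational focus sweep the abstract refers to: occluders in front of the chosen plane blur out while the plane itself stays sharp.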

Original language: English
Article number: 102
Journal: Journal of Imaging
Volume: 4
Issue number: 8
DOIs
Publication status: Published - Aug 2018
Externally published: Yes

Keywords

  • Computational imaging
  • Image-based rendering
  • Light fields
  • Synthetic apertures
