open_iA: A tool for processing and visual analysis of industrial computed tomography datasets

Bernhard Fröhler, Johannes Weissenböck, Marcel Schiwarth, Johann Kastner, Christoph Heinzl

Research output: Contribution to journal › Article › peer-review

Abstract

open_iA is a platform for visual analysis and processing of volumetric datasets. The main driver behind its development is to provide a common framework for performing visual analytics on industrial Computed Tomography (CT) data. In contrast to general volume visualization or processing software, it offers specialized tools which address domain-specific analysis scenarios such as porosity determination, fiber characterization and image processing parameter space analysis. The wide range of building blocks which these tools consist of facilitates the development of new research prototypes in this application domain. It currently provides a variety of image processing filters, e.g. for noise reduction, segmentation, data type conversion, convolution, geometric transformations, and morphological operations. open_iA is written in C++ using Qt, VTK and ITK, as well as some other open source libraries. open_iA is continuously improved and extended. It is open source and available on GitHub. The core of open_iA provides functionality for loading and displaying volumetric datasets in several file formats, as well as support for loading polygonal datasets. A comparison of volumes is facilitated through a magic lens as well as optional position indicators in all open child windows. In addition, it provides a view for showing the image histogram, where the transfer function used for the slicer views and the 3D renderer is also configured.
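To illustrate the kind of ITK-based building blocks the abstract refers to (image processing filters such as noise reduction applied to volumetric CT data), the following is a minimal, self-contained sketch of an ITK pipeline that reads a volume, applies a median filter and writes the result. It is an illustrative assumption about how such a filter could be wired up with plain ITK, not open_iA's actual API; the file names, pixel type and filter choice are placeholders.

```cpp
// Illustrative sketch only: a plain ITK filter pipeline of the kind that
// open_iA's image processing filters build on. Not open_iA code.
#include <iostream>

#include <itkImage.h>
#include <itkImageFileReader.h>
#include <itkImageFileWriter.h>
#include <itkMedianImageFilter.h>

int main(int argc, char* argv[])
{
    if (argc < 3)
    {
        std::cerr << "Usage: " << argv[0] << " input.mhd output.mhd\n";
        return EXIT_FAILURE;
    }

    // 3D volume of 16-bit unsigned voxels, a common layout for industrial CT scans
    using ImageType = itk::Image<unsigned short, 3>;

    auto reader = itk::ImageFileReader<ImageType>::New();
    reader->SetFileName(argv[1]);

    // Median filter as a simple noise reduction step
    auto median = itk::MedianImageFilter<ImageType, ImageType>::New();
    median->SetInput(reader->GetOutput());
    median->SetRadius(1);  // 3x3x3 neighborhood

    auto writer = itk::ImageFileWriter<ImageType>::New();
    writer->SetInput(median->GetOutput());
    writer->SetFileName(argv[2]);

    try
    {
        writer->Update();  // triggers the whole read-filter-write pipeline
    }
    catch (const itk::ExceptionObject& err)
    {
        std::cerr << "ITK pipeline failed: " << err << '\n';
        return EXIT_FAILURE;
    }
    return EXIT_SUCCESS;
}
```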
Original language: English
Number of pages: 3
Journal: Journal of Open Source Software
Volume: 4
Issue number: 35
DOIs
Publication status: Published - Mar 2019

Keywords

  • computed tomography
  • visual analysis
  • visualization
