VISION - an open-source software for automated multi-dimensional image analysis of cellular biophysics

Florian Weber, Sofiia Iskrak, Franziska Ragaller, Jan Schlegel, Birgit Plochberger, Erdinc Sezgin, Luca A. Andronico

Research output: Contribution to journal › Article › peer-review


Abstract

Environment-sensitive probes are frequently used in spectral and multi-channel microscopy to study alterations in cell homeostasis. However, the few open-source packages available for processing spectral images are limited in scope. Here, we present VISION, a stand-alone Python-based software package for spectral analysis with improved applicability. In addition to classical intensity-based analysis, our software can batch-process multidimensional images, with advanced single-cell segmentation, and apply user-defined mathematical operations on spectra to calculate biophysical and metabolic parameters of single cells. VISION allows for 3D and temporal mapping of properties such as membrane fluidity and mitochondrial potential. We demonstrate the broad applicability of VISION by applying it to study the effect of various drugs on cellular biophysical properties, the correlation between membrane fluidity and mitochondrial potential, protein distribution at cell-cell contacts, and the properties of nanodomains in cell-derived vesicles. Together with the code, we provide a graphical user interface for easy adoption.
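To illustrate the kind of user-defined spectral operation the abstract describes, a common readout for environment-sensitive membrane probes is the generalized polarization, GP = (I_blue − I_red) / (I_blue + I_red), which maps membrane fluidity from two spectral bands. The sketch below is a minimal NumPy illustration of that calculation, not VISION's own API; the function names, the example wavelengths (~440/490 nm, typical for Laurdan-type probes) and the label-mask convention are assumptions for demonstration.

```python
import numpy as np

def gp_map(i_ordered, i_disordered, eps=1e-9):
    """Pixel-wise generalized polarization (GP) from two spectral channels.

    i_ordered:    intensity image integrated over the blue-shifted band
                  (e.g. ~440 nm for Laurdan-type probes; ordered phase)
    i_disordered: intensity image integrated over the red-shifted band
                  (e.g. ~490 nm; disordered/fluid phase)
    Returns values in [-1, 1]; higher GP indicates lower membrane fluidity.
    """
    i_o = i_ordered.astype(float)
    i_d = i_disordered.astype(float)
    # eps guards against division by zero in background pixels
    return (i_o - i_d) / (i_o + i_d + eps)

def per_cell_gp(gp, labels):
    """Mean GP per segmented cell.

    `labels` is an integer mask from single-cell segmentation
    (0 = background), as such tools conventionally produce.
    """
    return {int(lab): float(gp[labels == lab].mean())
            for lab in np.unique(labels) if lab != 0}
```

In practice the same two-channel ratio generalizes to other probe pairs (e.g. mitochondrial potential dyes); batch-processing a time series would simply apply these functions frame by frame.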

Original language: English
Article number: jcs262166
Journal: Journal of Cell Science
Volume: 137
Issue number: 20
Publication status: Published - 15 Oct 2024

Keywords

  • Biophysical properties
  • Image analysis
  • Multi-dimension microscopy
  • Open source
  • Python
  • Spectral Imaging
  • Humans
  • Image Processing, Computer-Assisted/methods
  • Software
  • Biophysics/methods
  • Membrane Fluidity
