Optical High-Speed Rolling Mark Detection Using Object Detection and Levenshtein Distance

Gerald Zauner, Manuel Krammer

Research output: Contribution to journal › Article › peer-review


Featured Application: Railroad Infrastructure Detection. This paper presents an automated high-speed rolling mark recognition system for railroad rails based on image processing. Rolling marks, which consist of numbers, letters, and special characters, are embossed into the rail web as raised (3D) information. They provide crucial details about the rail manufacturer, steel quality, year of production, and rail profile, and thus give rail infrastructure managers valuable insight into their assets. The rolling marks were captured with a standard color camera under dark-field illumination. Individual numbers, letters, and special characters were recognized using state-of-the-art deep neural network object detection, specifically the YOLO architecture. The detected character sequences are then interpreted and corrected against a set of reference rolling marks by computing a weighted Levenshtein distance, which allows the system to identify and rectify partially misrecognized rolling marks. The proposed system achieved accurate and reliable identification of rolling marks even when the detection step produced partial errors, and thus has the potential to substantially improve the management and maintenance of railroad infrastructure.
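The correction step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: a detected character sequence is matched against a list of reference rolling marks using a weighted Levenshtein distance, where the per-character substitution weights (here, reduced costs for visually similar glyphs such as `0`/`O` and `1`/`I`) are hypothetical assumptions, as are the example mark strings.

```python
# Hypothetical confusion weights: substituting visually similar glyphs
# costs less than a full edit (1.0). These values are illustrative only.
CONFUSION_COST = {
    ("0", "O"): 0.2, ("O", "0"): 0.2,
    ("1", "I"): 0.2, ("I", "1"): 0.2,
    ("5", "S"): 0.3, ("S", "5"): 0.3,
}

def weighted_levenshtein(a: str, b: str) -> float:
    """Edit distance where substitutions between similar glyphs are cheaper."""
    m, n = len(a), len(b)
    # d[i][j] = cost of transforming a[:i] into b[:j]
    d = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = float(i)
    for j in range(1, n + 1):
        d[0][j] = float(j)
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                sub = 0.0
            else:
                sub = CONFUSION_COST.get((a[i - 1], b[j - 1]), 1.0)
            d[i][j] = min(
                d[i - 1][j] + 1.0,       # deletion
                d[i][j - 1] + 1.0,       # insertion
                d[i - 1][j - 1] + sub,   # (weighted) substitution
            )
    return d[m][n]

def correct_mark(detected: str, references: list[str]) -> str:
    """Return the reference rolling mark closest to the detected sequence."""
    return min(references, key=lambda ref: weighted_levenshtein(detected, ref))
```

With such weights, a partially misread mark like `"V0EST"` (hypothetical example) maps to the reference `"VOEST"` at a cost of only 0.2, so small glyph confusions do not prevent correct identification.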

Original language: English
Article number: 8678
Journal: Applied Sciences (Switzerland)
Issue number: 15
Publication status: Published - Aug 2023


  • railway infrastructure
  • rolling marks
  • computer vision
  • object detection
  • Levenshtein distance
  • YOLO (you only look once)

