Local Optimization Often is Ill-conditioned in Genetic Programming for Symbolic Regression.

Research output: Chapter in Book/Report/Conference proceedings › Conference contribution › peer-review

4 Citations (Scopus)

Abstract

Gradient-based local optimization has been shown to improve the results of genetic programming (GP) for symbolic regression. Several state-of-the-art GP implementations use iterative nonlinear least squares (NLS) algorithms, such as the Levenberg-Marquardt algorithm, for local optimization. The effectiveness of NLS algorithms depends on appropriate scaling and conditioning of the optimization problem, which has so far been ignored in the symbolic regression and GP literature. In this study we use a singular value decomposition of NLS Jacobian matrices to determine the numeric rank and the condition number. We perform experiments with a GP implementation and six different benchmark datasets. Our results show that rank-deficient and ill-conditioned Jacobian matrices occur frequently across all datasets. The issue is less extreme when restricting GP tree size and when using many nonlinear functions in the function set.
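The diagnostic described in the abstract can be illustrated in a few lines: fit the numeric constants of a candidate expression with Levenberg-Marquardt NLS, then inspect the singular values of the Jacobian at the solution. The following is a minimal sketch, not the authors' code; the model `f`, the data, and the rank tolerance are illustrative assumptions (the tolerance mirrors the cutoff used by `numpy.linalg.matrix_rank`).

```python
# Minimal sketch: Levenberg-Marquardt NLS fit, then SVD-based numeric rank
# and condition number of the Jacobian at the optimum (assumed setup).
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Hypothetical candidate expression a GP run might propose:
# f(x; theta) = theta0 * exp(theta1 * x) + theta2 * x
def model(theta, x):
    return theta[0] * np.exp(theta[1] * x) + theta[2] * x

def residuals(theta, x, y):
    return model(theta, x) - y

x = np.linspace(0.0, 2.0, 50)
y = model([1.5, 0.8, -0.3], x) + 0.01 * rng.standard_normal(x.size)

# Levenberg-Marquardt local optimization of the numeric constants.
fit = least_squares(residuals, x0=[1.0, 1.0, 0.0], args=(x, y), method="lm")

# Singular values of the Jacobian at the solution (descending order).
J = fit.jac
s = np.linalg.svd(J, compute_uv=False)

# Numeric rank: singular values above a tolerance relative to the largest.
tol = max(J.shape) * np.finfo(J.dtype).eps * s[0]
numeric_rank = int(np.sum(s > tol))

# Condition number: ratio of largest to smallest singular value; very
# large values signal an ill-conditioned local optimization problem.
cond = s[0] / s[-1]

print(f"rank = {numeric_rank}/{J.shape[1]}, condition number = {cond:.3e}")
```

A rank below the number of parameters, or a very large condition number, indicates the kind of rank-deficient or ill-conditioned Jacobian the paper reports as frequent in GP runs.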
Original language: English
Title of host publication: Proceedings - 2022 24th International Symposium on Symbolic and Numeric Algorithms for Scientific Computing, SYNASC 2022
Editors: Bruno Buchberger, Mircea Marin, Viorel Negru, Daniela Zaharie
Publisher: IEEE
Pages: 304-310
Number of pages: 7
ISBN (Electronic): 978-1-6654-6545-8
ISBN (Print): 978-1-6654-6546-5
DOIs
Publication status: Published - 2023

Publication series

Name: Proceedings - 2022 24th International Symposium on Symbolic and Numeric Algorithms for Scientific Computing, SYNASC 2022

Keywords

  • Evolutionary computing and genetic algorithms
  • Gradient methods
  • Least squares methods
  • Nonlinear approximation
