Parameter identification for symbolic regression using nonlinear least squares

Research output: Contribution to journal › Article › peer-review

54 Citations (Scopus)


In this paper we analyze the effects of using nonlinear least squares for parameter identification of symbolic regression models and integrate it as a local search mechanism in tree-based genetic programming. We employ the Levenberg–Marquardt algorithm for parameter optimization and calculate gradients via automatic differentiation. We provide examples where parameter identification succeeds and fails, and we highlight its computational overhead. Using an extensive suite of symbolic regression benchmark problems, we demonstrate the improved performance obtained by incorporating nonlinear least squares within genetic programming. Our results are compared with recently published results from several genetic programming variants and state-of-the-art machine learning algorithms. Genetic programming with nonlinear least squares performs among the best on the defined benchmark suite, and the local search can easily be integrated into different genetic programming algorithms as long as only differentiable functions are used within the models.
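The local search described in the abstract amounts to fixing the structure of a candidate model (a GP tree) and tuning only its numeric parameters with Levenberg–Marquardt, using exact gradients. A minimal sketch of this idea, using SciPy's `least_squares` with `method="lm"` and a hand-derived Jacobian as a stand-in for the automatic differentiation the paper employs (the model `c0 * exp(c1 * x) + c2` and all constants are illustrative, not taken from the paper):

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical candidate model structure, as a GP tree might encode it:
# f(x; c) = c0 * exp(c1 * x) + c2, with parameters c to be identified.
def residuals(c, x, y):
    return c[0] * np.exp(c[1] * x) + c[2] - y

# Analytic Jacobian of the residuals w.r.t. c (stand-in for the
# automatic differentiation used in the paper).
def jacobian(c, x, y):
    e = np.exp(c[1] * x)
    return np.column_stack([e, c[0] * x * e, np.ones_like(x)])

# Synthetic data from known parameters [1.5, -0.8, 0.5] plus small noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 4.0, 50)
y = 1.5 * np.exp(-0.8 * x) + 0.5 + rng.normal(0.0, 0.01, x.size)

# Levenberg–Marquardt refinement from a rough initial guess.
fit = least_squares(residuals, x0=[1.0, -1.0, 0.0], jac=jacobian,
                    args=(x, y), method="lm")
print(fit.x)  # ≈ [1.5, -0.8, 0.5]
```

In the GP setting this refinement would run per individual during evaluation, which is the computational overhead the abstract mentions; it also shows why the approach requires every function in the tree to be differentiable.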

Original language: English
Pages (from-to): 471-501
Number of pages: 31
Journal: Genetic Programming and Evolvable Machines
Issue number: 3
Publication status: Published - 1 Sept 2020


Keywords:
  • Automatic differentiation
  • Genetic programming
  • Nonlinear least squares
  • Parameter identification
  • Symbolic regression


