Revisiting Gradient-Based Local Search in Symbolic Regression

Publication: Chapter in book/report/conference proceedings › Chapter › peer-reviewed

Abstract

Gradient descent-based local search can dramatically improve solution performance in symbolic regression tasks, at the cost of significantly higher runtime as well as increased risks of overfitting. In this paper, we investigate exactly what amount of local search is really needed within the GP population. We show that low intensity local search is sufficient to boost the fitness of the entire population, provided that local search information in the form of optimized numerical parameters is written back into the genotype at least some of the time. Our results suggest that spontaneous adaptations (in the Lamarckian sense) act as evolutionary fuel for the Baldwin effect in genetic programming, and that in the absence of the former, the latter does not occur and evolution is hindered. The Lamarckian model works particularly well in symbolic regression, as local search only affects model coefficients and does not affect the inheritance of useful building blocks contained in the model structure.
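The mechanism described above — tuning only a model's numeric coefficients by a few gradient steps and optionally writing the improved values back into the genotype — can be illustrated with a minimal sketch. The fixed expression structure, the synthetic data, and all names below are illustrative assumptions, not the paper's actual setup:

```python
import numpy as np

# Synthetic regression data (illustrative, not from the paper)
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=50)
y = 3.0 * X + 0.5 * np.sin(X)  # a target the fixed structure can represent

def model(theta, x):
    # Fixed "tree" structure theta[0]*x + theta[1]*sin(x);
    # local search adapts only the coefficients, never the structure.
    return theta[0] * x + theta[1] * np.sin(x)

def grad_mse(theta, x, y):
    # Analytic gradient of the mean squared error w.r.t. each coefficient
    r = model(theta, x) - y
    return np.array([2 * np.mean(r * x), 2 * np.mean(r * np.sin(x))])

def local_search(theta, steps=10, lr=0.05):
    # Low-intensity local search: only a handful of gradient-descent steps
    theta = theta.copy()
    for _ in range(steps):
        theta -= lr * grad_mse(theta, X, y)
    return theta

genotype = np.array([1.0, 1.0])  # coefficients stored in the genotype
tuned = local_search(genotype)
genotype = tuned  # Lamarckian write-back; a Baldwinian scheme would skip this
```

Under the Baldwinian model the tuned coefficients would only influence fitness evaluation, while the genotype keeps its original values; the paper's point is that at least occasional Lamarckian write-back is needed for the population-wide benefit to materialize.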
Original language: English
Title: Genetic Programming Theory and Practice
Editors: Stephan Winkler, Wolfgang Banzhaf, Ting Hu, Alexander Lalejini
Publisher: Springer Nature
Chapter: 13
Pages: 259–273
Number of pages: 15
Volume: XXI
ISBN (electronic): 978-981-96-0077-9
ISBN (print): 978-981-96-0076-2, 978-981-96-0079-3
DOIs
Publication status: Published - 25 Feb 2025
Event: Genetic Programming Theory and Practice - University of Michigan, Ann Arbor, United States
Duration: 6 Jun 2024 – 8 Jun 2024
http://gptp-workshop.com

Workshop

Workshop: Genetic Programming Theory and Practice
Short title: GPTP 2025
Country/Territory: United States
City: Ann Arbor
Period: 06.06.2024 – 08.06.2024
Internet address: http://gptp-workshop.com
