Abstract
Gradient-descent-based local search can dramatically improve solution quality in symbolic regression, at the cost of significantly higher runtime and an increased risk of overfitting. In this paper, we investigate how much local search is actually needed within the GP population. We show that low-intensity local search is sufficient to boost the fitness of the entire population, provided that local search information, in the form of optimized numerical parameters, is written back into the genotype at least some of the time. Our results suggest that spontaneous adaptations (in the Lamarckian sense) act as evolutionary fuel for the Baldwin effect in genetic programming; in the absence of the former, the latter does not occur and evolution is hindered. The Lamarckian model works particularly well in symbolic regression, since local search only adjusts model coefficients and does not interfere with the inheritance of useful building blocks contained in the model structure.
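The write-back mechanism described in the abstract can be sketched as follows. This is a hypothetical minimal example, not the chapter's actual implementation: the model structure is fixed to `y = a*x + b`, "local search" is a few gradient-descent steps on the coefficients `(a, b)`, fitness is always computed from the tuned coefficients (Baldwinian learning), and with probability `writeback_prob` the tuned values also replace the genotype's coefficients (Lamarckian inheritance). All names and parameter values here are illustrative assumptions.

```python
import random

def local_search(coeffs, xs, ys, steps=5, lr=0.01):
    """A few gradient-descent steps minimizing mean squared error over (a, b).

    Only the numerical coefficients are optimized; the model structure
    (here a fixed linear form) is untouched, as in the paper's setting.
    """
    a, b = coeffs
    n = len(xs)
    for _ in range(steps):
        grad_a = sum(2 * (a * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (a * x + b - y) for x, y in zip(xs, ys)) / n
        a -= lr * grad_a
        b -= lr * grad_b
    return (a, b)

def mse(coeffs, xs, ys):
    a, b = coeffs
    return sum((a * x + b - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def evaluate(individual, xs, ys, writeback_prob):
    """Fitness is computed from the locally optimized coefficients
    (Baldwinian); with probability `writeback_prob` the optimized values
    are also written back into the genotype (Lamarckian)."""
    tuned = local_search(individual["coeffs"], xs, ys)
    if random.random() < writeback_prob:
        individual["coeffs"] = tuned  # Lamarckian write-back
    return mse(tuned, xs, ys)
```

With `writeback_prob = 0` this reduces to purely Baldwinian evolution (learning affects fitness but not inheritance); with any nonzero value, some adapted coefficients enter the gene pool, which is the condition the paper identifies as necessary for the Baldwin effect to take hold.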
| Original language | English |
|---|---|
| Title | Genetic Programming Theory and Practice |
| Editors | Stephan Winkler, Wolfgang Banzhaf, Ting Hu, Alexander Lalejini |
| Publisher | Springer Nature |
| Chapter | 13 |
| Pages | 259–273 |
| Number of pages | 15 |
| Volume | XXI |
| ISBN (electronic) | 978-981-96-0077-9 |
| ISBN (print) | 978-981-96-0076-2, 978-981-96-0079-3 |
| DOIs | |
| Publication status | Published - 25 Feb 2025 |
| Event | Genetic Programming Theory and Practice - University of Michigan, Ann Arbor, United States. Duration: 6 June 2024 → 8 June 2024. http://gptp-workshop.com |
Workshop
| Workshop | Genetic Programming Theory and Practice |
|---|---|
| Short title | GPTP 2025 |
| Country/Territory | United States |
| City | Ann Arbor |
| Period | 06.06.2024 → 08.06.2024 |
| Internet address | |