Abstract
Gradient-descent-based local search can dramatically improve solution quality in symbolic regression, at the cost of significantly higher runtime and an increased risk of overfitting. In this paper, we investigate how much local search is actually needed within the GP population. We show that low-intensity local search is sufficient to boost the fitness of the entire population, provided that the information it produces, in the form of optimized numerical parameters, is written back into the genotype at least some of the time. Our results suggest that spontaneous adaptations (in the Lamarckian sense) act as evolutionary fuel for the Baldwin effect in genetic programming, and that in the absence of the former, the latter does not occur and evolution is hindered. The Lamarckian model works particularly well in symbolic regression because local search tunes only the model coefficients and leaves the inheritance of useful building blocks contained in the model structure untouched.
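The Lamarckian/Baldwinian mechanism the abstract describes can be made concrete with a small sketch. The following is an illustration, not the chapter's implementation: it assumes a toy individual whose structure is fixed (c0·sin(x) + c1·x²) so that only the coefficient vector is tuned, uses finite-difference gradients in place of a real gradient-descent optimizer, and introduces a hypothetical `lamarckian_rate` parameter controlling how often tuned coefficients are written back into the genotype. Fitness is always taken from the tuned coefficients (Baldwinian lifetime learning); the write-back is the Lamarckian step.

```python
import numpy as np

def evaluate(coeffs, X, y):
    """MSE of the toy fixed-structure model c0*sin(x) + c1*x**2."""
    pred = coeffs[0] * np.sin(X) + coeffs[1] * X**2
    return np.mean((pred - y) ** 2)

def gradient_step(coeffs, X, y, lr=0.01, eps=1e-6):
    """One gradient-descent step on the coefficients only (central differences)."""
    grad = np.zeros_like(coeffs)
    for i in range(len(coeffs)):
        up, down = coeffs.copy(), coeffs.copy()
        up[i] += eps
        down[i] -= eps
        grad[i] = (evaluate(up, X, y) - evaluate(down, X, y)) / (2 * eps)
    return coeffs - lr * grad

def local_search(ind, X, y, steps=3, lamarckian_rate=0.1, rng=None):
    """Low-intensity local search: a few gradient steps on the coefficients.

    Fitness is computed from the tuned coefficients (Baldwinian), but the
    tuned values are committed to the genotype only with probability
    `lamarckian_rate` (Lamarckian write-back).
    """
    rng = rng or np.random.default_rng()
    tuned = ind["coeffs"].copy()
    for _ in range(steps):
        tuned = gradient_step(tuned, X, y)
    ind["fitness"] = evaluate(tuned, X, y)   # learned fitness guides selection
    if rng.random() < lamarckian_rate:       # occasional write-back
        ind["coeffs"] = tuned                # coefficients become heritable
    return ind

# Usage: one individual, a handful of gradient steps, 10% write-back chance.
rng = np.random.default_rng(0)
X = np.linspace(-2.0, 2.0, 50)
y = 1.5 * np.sin(X) - 0.5 * X**2
ind = {"coeffs": np.array([0.1, 0.1]), "fitness": None}
ind = local_search(ind, X, y, steps=3, lamarckian_rate=0.1, rng=rng)
```

With `lamarckian_rate=0` this degenerates to a purely Baldwinian scheme; the abstract's finding is that some nonzero write-back probability is needed for the population-level benefit to appear. Note that the write-back touches only the coefficient vector, so the model structure, and with it any useful building blocks, is inherited unchanged.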
Original language | English |
---|---|
Title of host publication | Genetic Programming Theory and Practice |
Editors | Stephan Winkler, Wolfgang Banzhaf, Ting Hu, Alexander Lalejini |
Publisher | Springer Nature |
Chapter | 13 |
Pages | 259–273 |
Number of pages | 15 |
Volume | XXI |
ISBN (Electronic) | 978-981-96-0077-9 |
ISBN (Print) | 978-981-96-0076-2, 978-981-96-0079-3 |
DOIs | |
Publication status | Published - 25 Feb 2025 |
Event | Genetic Programming Theory and Practice, University of Michigan, Ann Arbor, United States. Duration: 6 Jun 2024 → 8 Jun 2024. http://gptp-workshop.com |
Workshop
Workshop | Genetic Programming Theory and Practice |
---|---|
Abbreviated title | GPTP 2024 |
Country/Territory | United States |
City | Ann Arbor |
Period | 06.06.2024 → 08.06.2024 |
Internet address | http://gptp-workshop.com |
Keywords
- genetic programming
- symbolic regression
- local search
- gradient descent
- evolvability