Vectorial Genetic Programming (GP) is a young branch of GP in which the training data for symbolic models may include not only regular, scalar variables but also vector variables. The model's abilities are extended accordingly to allow operations on vectors, where most vector operations are simply performed component-wise. Additionally, new aggregation functions are introduced that reduce vectors to scalars, allowing the model to extract information from vectors by itself and thus eliminating the need for the prior feature engineering that traditional GP otherwise requires to utilize vector data. Due to the white-box nature of symbolic models, operations on vectors can be interpreted as easily as regular operations on scalars. In this paper, we extend the ideas of vectorial GP from previous authors and propose a grammar-based approach for vectorial GP that can deal with the various challenges noted. To evaluate grammar-based vectorial GP, we have designed new benchmark functions that contain both scalar and vector variables, and we show that traditional GP falls short very quickly in certain scenarios. Grammar-based vectorial GP, however, is able to solve all presented benchmarks.
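The abstract's core idea — component-wise vector arithmetic plus aggregation functions that reduce vectors to scalars inside a symbolic model — can be sketched as follows. This is an illustrative toy evaluator, not the authors' implementation; the tree encoding, variable names, and the choice of `mean` as the aggregation function are assumptions made for the example.

```python
import numpy as np

def evaluate(node, env):
    """Recursively evaluate a symbolic expression tree.

    A node is a variable name (str), a numeric constant,
    or a tuple (operator, child, ...). Vector variables are
    NumPy arrays, so '+' and '*' act component-wise on them,
    while 'mean' aggregates a vector into a scalar.
    """
    if isinstance(node, str):
        return env[node]
    if isinstance(node, (int, float)):
        return node
    op, *children = node
    args = [evaluate(c, env) for c in children]
    if op == "+":
        return args[0] + args[1]        # component-wise on vectors
    if op == "*":
        return args[0] * args[1]        # component-wise on vectors
    if op == "mean":
        return float(np.mean(args[0]))  # aggregation: vector -> scalar
    raise ValueError(f"unknown operator {op!r}")

# Example with one scalar variable (x) and one vector variable (v):
env = {"x": 2.0, "v": np.array([1.0, 2.0, 3.0])}
# Model: x * mean(v + v) = 2.0 * mean([2, 4, 6]) = 2.0 * 4.0
model = ("*", "x", ("mean", ("+", "v", "v")))
print(evaluate(model, env))  # 8.0
```

Because the aggregation is part of the evolved expression itself, the model — rather than a human feature engineer — decides how each vector is summarized.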
Original language: English
Title of host publication: Genetic Programming Theory and Practice XVIII
Number of pages: 23
ISBN (Electronic): 978-981-16-8113-4
ISBN (Print): 978-981-16-8112-7, 978-981-16-8115-8
Publication status: Published - 11 Feb 2022


