The paper (https://t.co/HYLDzlcVey) shows results in symbolic regression (i.e. evolving equations to match data), text style-transfer (changing text sentiment), evolving Stable Diffusion prompts, and preliminary explorations of creating variation for Python code.
— Carper (@carperai) February 24, 2023
Abstract of the linked paper:
This paper pursues the insight that language models naturally enable an intelligent variation operator similar in spirit to evolutionary crossover. In particular, language models of sufficient scale demonstrate in-context learning, i.e. they can learn from associations between a small number of input patterns to generate outputs incorporating such associations (also called few-shot prompting). This ability can be leveraged to form a simple but powerful variation operator, i.e. to prompt a language model with a few text-based genotypes (such as code, plain-text sentences, or equations), and to parse its corresponding output as those genotypes' offspring. The promise of such language model crossover (which is simple to implement and can leverage many different open-source language models) is that it enables a simple mechanism to evolve semantically-rich text representations (with few domain-specific tweaks), and naturally benefits from current progress in language models. Experiments in this paper highlight the versatility of language-model crossover, through evolving binary bit-strings, sentences, equations, text-to-image prompts, and Python code. The conclusion is that language model crossover is a promising method for evolving genomes representable as text.
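The operator the abstract describes is simple enough to sketch in a few lines. The Python below is a minimal illustration under stated assumptions, not the paper's implementation: `sample_llm` is a hypothetical stand-in for any function that maps a prompt string to a sampled continuation (e.g. a wrapper around an open-source language model), and the one-genotype-per-line prompt and parsing convention is an assumption made here for clarity.

```python
import random

def lm_crossover(parents, sample_llm, n_children=1, sep="\n"):
    """Language-model crossover: few-shot prompt a language model with
    parent genotypes and parse its continuation as offspring.

    `sample_llm` (hypothetical) maps a prompt string to a sampled
    continuation string.
    """
    children = []
    for _ in range(n_children):
        # Shuffle the parents so the model sees varied orderings across calls.
        shuffled = random.sample(parents, len(parents))
        # The prompt is just the parent genotypes, one per line; the model's
        # continuation is read as a new genotype in the same format.
        prompt = sep.join(shuffled) + sep
        completion = sample_llm(prompt)
        # Keep only the first line of the continuation as the child.
        child = completion.split(sep)[0].strip()
        if child:
            children.append(child)
    return children
```

In a full evolutionary loop, this operator would be applied to parents selected by fitness, with the returned children evaluated and fed back into the population, just as with conventional crossover.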