Generating poetry by example

Can computers learn to write poetry in particular styles? This is the research question addressed in "End-to-end Style-conditioned Poetry Generation", a paper published today in the proceedings of the 5th Joint SIGHUM Workshop on Computational Linguistics for Cultural Heritage, Social Sciences, Humanities and Literature.

This joint work between OFAI's Tristan Miller and colleagues at Technische Universität Darmstadt and the University of Göttingen presents an end-to-end model for poetry generation based on conditioned recurrent neural network (RNN) language models. The goal is to learn stylistic features, including some associated with humorous poetry, from examples alone.
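The paper's own implementation is not reproduced here, but the general idea of a style-conditioned RNN language model can be sketched: a learned style embedding is fed to the network alongside each token, so that generation is biased toward the target style. All names, dimensions, and the concatenation-based conditioning scheme below are illustrative assumptions, not the authors' code, and the weights are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB, STYLES, EMB, HID = 20, 2, 8, 16

# Random parameters for illustration; in practice these would be trained
# with a cross-entropy objective on example poems of each style.
W_tok = rng.normal(scale=0.1, size=(VOCAB, EMB))    # token embeddings
W_sty = rng.normal(scale=0.1, size=(STYLES, EMB))   # style embeddings
W_xh = rng.normal(scale=0.1, size=(2 * EMB, HID))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(HID, HID))       # hidden -> hidden
W_hy = rng.normal(scale=0.1, size=(HID, VOCAB))     # hidden -> logits

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def generate(style_id, length=5, start_token=0):
    """Sample a token sequence conditioned on a style label
    (a hypothetical conditioning scheme: the style embedding is
    concatenated to the token embedding at every time step)."""
    h = np.zeros(HID)
    tok, out = start_token, []
    for _ in range(length):
        x = np.concatenate([W_tok[tok], W_sty[style_id]])
        h = np.tanh(x @ W_xh + h @ W_hh)
        probs = softmax(h @ W_hy)
        tok = int(rng.choice(VOCAB, p=probs))
        out.append(tok)
    return out

seq = generate(style_id=1)
```

With trained weights, changing `style_id` would steer the sampled text toward a different learned style, which is the sense in which such a model picks up stylistic features from examples alone.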

The LaTeCH-CLfL workshop, co-located with the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP), will be held online and in person in Punta Cana, Dominican Republic, on November 11, 2021.