Using Language Models in Causal Story Generation
Author(s)
Li, Siyan
Abstract
Story generation remains challenging because it is still difficult to automatically generate stories that are both logically coherent and natural. In this thesis, we propose an approach that combines our previous story-generation pipeline with the GPT-2 language model. The new architecture filters generation results from GPT-2, and it outperforms the unfiltered GPT-2 model on tasks such as maintaining a single plotline and ordering events sensibly.
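The core idea of the architecture can be illustrated with a minimal sketch: sample several candidate continuations from a language model, score each for coherence, and keep only those that pass a threshold. The function names, the toy scorer, and the threshold below are all hypothetical illustrations, not the thesis's actual pipeline or scoring criteria.

```python
def filter_candidates(candidates, score_fn, threshold):
    """Keep only generated continuations whose coherence score
    meets the threshold (hypothetical sketch of the filtering step)."""
    return [c for c in candidates if score_fn(c) >= threshold]

# Toy scorer standing in for a real coherence model: here it simply
# checks whether the continuation keeps the established protagonist.
def toy_score(sentence):
    return 1.0 if "Alice" in sentence else 0.0

candidates = [
    "Alice opened the door.",
    "The weather was nice.",
    "Alice walked inside.",
]
kept = filter_candidates(candidates, toy_score, 0.5)
# kept -> ["Alice opened the door.", "Alice walked inside."]
```

In the thesis's setting, the candidates would come from GPT-2 and the scorer would encode plot-level constraints (single plotline, sensible event order) rather than this toy keyword check.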
Date
2020-12
Resource Type
Text
Resource Subtype
Undergraduate Thesis