1974-lem-cyberiad-trurlselectronicbard.pdf: “The First Sally (A), or, Trurl's Electronic Bard”, Stanisław Lem, Michael Kandel ( )
2020-case.pdf: “GPT-2 AI Poetry Generation: Writing like Donne”, Kaiya Case ( )
2020-elkins.pdf: “Can GPT-3 Pass a Writer’s Turing Test?”, (2020-09-14; ):
Until recently, the field of natural language generation relied upon formalized grammar systems, small-scale statistical models, and lengthy sets of heuristic rules. This older technology was fairly limited and brittle: it could remix language into word-salad poems or chat with humans only within narrowly defined topics. Recently, very large-scale statistical language models have dramatically advanced the field, and GPT-3 is just one example. It can internalize the rules of language without explicit programming or rules. Instead, much like a human child, GPT-3 learns language through repeated exposure, albeit on a far larger scale. Without explicit rules, it can sometimes fail at the simplest of linguistic tasks, but it can also excel at more difficult ones, like imitating an author or waxing philosophical.
2020-zitelli.pdf: “345M-GPT-2 After James Wright: Can AI Generate Convincing Contemporary Poetry?”, Jonah Zitelli ( )