GPT-3 Creative Fiction

Creative writing by OpenAI’s GPT-3 model, demonstrating poetry, dialogue, puns, literary parodies, and storytelling. Plus advice on effective GPT-3 prompt programming & avoiding common errors.
NN, fiction, GPT, poetry, humor, transhumanism
2020-06-19–2020-09-28 · finished · certainty: likely · importance: 8


I continue my AI poetry generation experiments with OpenAI's 2020 GPT-3, which is 116× larger, and much more powerful, than the 2019 GPT-2. GPT-3, however, is not merely a quantitative tweak yielding "GPT-2 but better"—it is qualitatively different, exhibiting eerie runtime learning capabilities allowing even the raw model, with zero finetuning, to "meta-learn" many textual tasks purely by example or instruction. One does not train or program GPT-3 in a normal way, but one engages in dialogue and writes prompts to teach GPT-3 what one wants.

Experimenting through the OpenAI Beta API in June 2020, I find that GPT-3 does not just match my finetuned GPT-2-1.5b-poetry for poem-writing quality, but exceeds it, while being versatile in handling poetry, Tom Swifty puns, science fiction, dialogue like Turing's Turing-test dialogue, literary style parodies… As the pièce de résistance, I recreate Stanislaw Lem's Cyberiad's "Trurl's Electronic Bard" poetry using GPT-3. (Along the way, I document instances of how the BPE text encoding unnecessarily damages GPT-3's performance on a variety of tasks, how to best elicit the highest-quality responses, common errors people make in using GPT-3, and test out GPT-3's improvements in NN weak points like logic or commonsense knowledge.)

GPT-3's samples are not just close to human level: they are creative, witty, deep, meta, and often beautiful. They demonstrate an ability to handle abstractions, like style parodies, that I have not seen in GPT-2 at all. Chatting with GPT-3 feels uncannily like chatting with a human. I was impressed by the results reported in the GPT-3 paper, and after spending a week trying it out, I remain impressed.

This page records GPT-3 samples I generated in my explorations, and thoughts on how to use GPT-3 and its remaining weaknesses. I hope you enjoy them even a tenth as much as I enjoyed testing GPT-3 and watching the completions scroll across my screen.

The latest and greatest neural network for unrestricted natural language generation is OpenAI's GPT-3. GPT-3 is like the GPT-2 and char-RNN models I've used extensively before1—only much more so, and then going beyond them in a fascinating new way.

Scaling works: quantity is a quality all its own. The scaling of GPT-2-1.5b by 116× to GPT-3-175b has worked surprisingly well and unlocked remarkable flexibility in the form of meta-learning, where GPT-3 can infer new patterns or tasks and follow instructions purely from text fed into it. What can we do with GPT-3? Here, we're all about having fun while probing GPT-3's abilities for creative writing tasks, primarily (but far from limited to) poetry. Fortunately, OpenAI granted me access to their Beta API service which provides a hosted GPT-3 model, letting me spend a great deal of time interacting with GPT-3 and writing things. Naturally, I'd like to write poetry with it: but GPT-3 is too big to finetune like I did GPT-2, and OA doesn't (yet) support any kind of training through their API. Must we content ourselves with mediocre generic poetry, at best, deprived of finetuning directly on chosen poetry corpuses or authors we might like to parody? How much does GPT-3 improve and what can it do?

Turns out: a lot! Below, I walk through first impressions of using GPT-3, and countless samples. In the latest twist on Moravec's paradox, GPT-3 still struggles with commonsense reasoning & factual knowledge of the sort a human finds effortless after childhood, but handles well things like satire & fiction writing & poetry, which we humans find so difficult & impressive even as adults. In addition to the Cyberiad, I'd personally highlight the Navy Seal & Harry Potter parodies, the Devil's Dictionary of Science/Academia, "Uber Poem", "The Universe Is a Glitch" poem (with AI-generated rock music version), & "Where the Sidewalk Ends".

What Benchmarks Miss: Demos

The GPT-3 paper includes evaluation of zero-shot/few-shot performance across a wide range of tasks, but I fear that unless one is familiar with the (deadly dull) benchmarks in question, it won't be impressive. You can skip to the appendix for more examples, or browse the random samples.

The original announcement includes many striking examples of GPT-3 capabilities ranging from chatbots to question-based Wikipedia search to legal discovery to homework grading to translation; I'd highlight AI Dungeon's Dragon model (example), and "Spreadsheets"/"Natural Language Shell"/"Code Completion"2. Andrew Mayne describes using GPT-3 to generate book recommendation lists & read interactive stories & engage in conversations with historical figures like Ada Lovelace3, summarize texts for elementary school children (also available as a service now, Simplify.so) or summarize movies in emoji (Matrix: "🤖🤐"; Hunger Games: "🏹🥊🌽🏆"), convert screenplay ↔︎ story, summarize/write emails, and rewrite HTML. Paras Chopra finds that GPT-3 knows enough Wikipedia & other URLs that the basic Q&A behavior can be augmented to include a 'source' URL, and so one can make a knowledge base 'search engine' with clickable links for any assertion (ie. the user can type in "What year was Richard Dawkins's The Selfish Gene published?" and GPT-3 will return a tuple like ("The Selfish Gene was published in 1976","https://en.wikipedia.org/wiki/The_Selfish_Gene") which can be parsed & presented as a search engine). Andreas Stuhlmüller explored using it to create suggestions for forecasting by breaking down high-level forecasting questions. Few-shot GPT-3 has also been tested on common moral reasoning problems, and while it doesn't do nearly as well as a finetuned model overall, interestingly, its performance degrades the least on the problems constructed to be hardest.
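Completions like Chopra's tuple format are convenient precisely because they are machine-parseable. A minimal sketch of the parsing side (the function name and returned dict are my own; the tuple literal is the example above):

```python
import ast

def parse_sourced_answer(completion: str) -> dict:
    # The model is prompted to emit a Python-style tuple literal:
    #   ("<answer>","<source URL>")
    # literal_eval parses it as data, without executing anything
    # arbitrary the model might have emitted.
    answer, url = ast.literal_eval(completion.strip())
    return {"answer": answer, "source": url}
```

A real deployment would also need to handle malformed completions (a try/except around `literal_eval`) and verify the URL actually resolves, since GPT-3 happily invents plausible-looking links.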

Others have experimented with Crunchyroll anime, Star Trek: The Next Generation, & Seinfeld plot summaries. Max Woolf has a repo of GPT-3 example prompts & various completions such as the original GPT-2 "unicorn" article, Revenge of the Sith, Stack Overflow Python questions, and his own tweets (note that many samples are bad because the prompts & hyperparameters are often deliberately bad, eg the temperature=0 samples, to demonstrate the large effect of poorly-chosen settings as a warning). Janelle Shane experimented with weird dog descriptions to accompany deformed GAN-dog samples, and 10,000-year nuclear waste warnings based on the famous long-term nuclear waste warning messages. Summers-Stay tried imitating Neil Gaiman & Terry Pratchett short stories with excellent results. Arram Sabeti has done "songs, stories, press releases, guitar tabs, interviews, essays, and technical manuals", with his Elon Musk Dr. Seuss poems a particular highlight. Paul Bellow (LitRPG) experiments with RPG backstory generation. Merzmensch Kosmopol enjoyed generating love letters written by a toaster. One writer co-wrote a SF Singularity short story with GPT-3, featuring regular meta where he & GPT-3 debate the story in-character. Daniel Bigham plays a word-association game which links Mongolia to (eventually) Kevin Bacon. Alexander Reben prompted for contemporary art/sculpture descriptions, and physically created some of the ones he liked best using a variety of mediums like matchsticks, toilet plungers, keys, collage, etc.

Harley Turan found that, somehow, GPT-3 can associate plausible hex codes with specific emoji. Even more perplexingly, Sharif Shameem discovered that GPT-3 could write JSX (a Javascript+CSS hybrid) according to a specification like "5 buttons, each with a random color and number between 1–10" or increase/decrease a balance in React or a very simple to-do list, and it would often work, or require relatively minor fixes. GPT-3 can also write some simple SVG shapes or SVG/Chart.js bar graphs, do text→LaTeX and SQL queries. While I don't think programmers need worry about unemployment (NNs will be a complement until they are so good they are a substitute), the code demos are impressive in illustrating just how diverse the skills created by pretraining on the Internet can be. Particularly intriguing in terms of code generation is Jordan Singer's Figma plugin which apparently creates a new Figma layout DSL & few-shot teaches it to GPT-3.

(I'd also highlight GPT-3's version of the famous GPT-2 recycling rant, an attempt at "Epic Rap Battles of History", GPT-3 playing 200-word tabletop RPGs with itself, the Serendipity recommendation engine which asks GPT-3 for movie/book recommendations, and Lawder's food label ingredient summarizer.)

One underexplored area of GPT-3 is using its "search" API, which, as the name indicates, takes a text prompt (the query) and searches a large set of possible results, and returns the 'most similar' one, in a highly abstract sense; Andrew Mayne demonstrates that it's much more than a simple keyword search engine by doing things like searching for abstract movie plots.4

5: eg given the Seven Basic Plots as the 7 possible results, which one does the query match? Result: "Voyage and Return: The protagonist goes to a strange land and, after overcoming the threats it poses or learning important lessons unique to that location, they return with experience."

The search API, interestingly, doesn't use an embedding, as one might expect; while [iGPT](https://openai.com/blog/image-gpt/ "Image GPT: We find that, just as a large transformer model trained on language can generate coherent text, the same exact model trained on pixel sequences can generate coherent image completions and samples. By establishing a correlation between sample quality and image classification accuracy, we show that our best generative model also contains features competitive with top convolutional nets in the unsupervised setting.") demonstrates that GPTs *can* be used to create embeddings, it seems OA has not done so with GPT-3. It instead borrows a trick from algorithmic information theory: the best result is the one that, when appended to the query, compresses most easily/is predicted with the least error/has the biggest average logit.
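The same trick can be imitated with an ordinary compressor standing in for GPT-3's likelihood: score each candidate by the extra bytes it costs once the query has already been seen. A toy sketch, with zlib as a crude stand-in for the model's conditional prediction (this is an illustration of the idea, not OA's implementation):

```python
import zlib

def compressed_size(s: str) -> int:
    return len(zlib.compress(s.encode("utf-8"), 9))

def best_match(query: str, candidates: list) -> str:
    # A candidate that shares phrases with the query compresses cheaply
    # when appended to it -- the compression analogue of "predicted with
    # the least error" by a model conditioned on the query.
    def extra_cost(candidate: str) -> int:
        return compressed_size(query + "\n" + candidate) - compressed_size(query)
    return min(candidates, key=extra_cost)
```

On the Seven Basic Plots example, a query paraphrasing a return-from-a-strange-land story picks out "Voyage and Return" even though few exact keywords match, because the shared phrasing compresses away.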

GPT-3 Implications

For my main discussion of why GPT-3 works and its implications, see my essay on the scaling hypothesis. Below is the summary:

GPT-3, announced by OpenAI in May 2020, is the largest neural network ever trained, by over an order of magnitude. Trained on Internet text data, it is the successor to GPT-2, which surprised everyone by its natural language understanding & generation ability. GPT-3 is even more surprising in that this vast increase in size did not run into diminishing returns, as many expected, but the benefits of scale continued to happen as forecasted by OpenAI. These benefits were not merely learning more facts & text than GPT-2, but qualitatively distinct & surprising in showing meta-learning: while GPT-2 learned how to do common natural language tasks like text summarization, GPT-3 instead learned how to follow directions and learn new tasks from a few examples. (As a result, GPT-3 outputs & interaction are more fascinating & human-like than GPT-2.)

While the immediate applications of GPT-3, like my poetry or humor writings, are nice, the short-term implications of GPT-3 are much more important.

First, while GPT-3 is expensive by conventional DL standards, it is cheap by scientific/commercial/military/government budget standards, and the results indicate that models could be made much larger. Second, models can also be made much more powerful, as GPT is an old approach known to be flawed in both minor & major ways, and far from an 'ideal' Transformer. Third, GPT-3's capabilities come from learning on raw (unsupervised) data; that has long been one of the weakest areas of DL, holding back progress in other areas like reinforcement learning or robotics. Models like GPT-3 suggest that large unsupervised models will be vital components of future DL systems, as they can be 'plugged into' systems to immediately provide understanding of the world, humans, natural language, and reasoning.

The meta-learning has a longer-term implication: it is a demonstration of the blessings of scale, where problems with simple neural networks vanish, and they become more powerful, more generalizable, more human-like when simply made very large & trained on very large datasets with very large compute—even though those properties are believed to require complicated architectures & fancy algorithms (and this perceived need drives much research). Unsupervised models benefit from this, as training on large corpuses like Internet-scale text presents a myriad of difficult problems to solve; this is enough to drive meta-learning, despite GPT not being designed for meta-learning in any way. (This family of phenomena is perhaps driven by neural networks functioning as ensembles of many sub-networks which all average out to an Occam's razor; for small data & models, they learn superficial or memorized parts of the data, but they can be forced into true learning by making the problems hard & rich enough.)

The blessings of scale in turn support a radical theory: an old AI paradigm held by a few pioneers in connectionism (early artificial neural network research) and by more recent deep learning researchers, the scaling hypothesis. The scaling hypothesis regards the blessings of scale as the secret of AGI: intelligence is 'just' simple neural units & learning algorithms applied to diverse experiences at a (currently) unreachable scale. As increasing computational resources permit running such algorithms at the necessary scale, the neural networks will get ever more intelligent.

When? Estimates of Moore's-law-like progress curves decades ago by pioneers like Hans Moravec indicated that it would take until the 2010s for the sufficiently-cheap compute for tiny insect-level prototype systems to be available, and the 2020s for the first sub-human systems to become feasible, and these forecasts are holding up. (Despite this vindication, the scaling hypothesis is so unpopular an idea, and difficult to prove in advance rather than as a fait accompli, that while the GPT-3 results finally drew some public notice after OpenAI enabled limited public access & people could experiment with it live, it is unlikely that many entities will modify their research philosophies, much less kick off an 'arms race'.)

Depending on what investments are made into scaling DL, and how fast compute grows, the 2020s should be quite interesting—sigmoid or singularity?

Quality

Objective metrics hard to interpret. How much better is (un-finetuned base) GPT-3? The likelihood loss is an absolute measure, as are the benchmarks, but it's hard to say what a decrease of, say, 0.1 bits per character might mean, or a 5% improvement on SQuAD, in terms of real-world use or creative fiction writing. It feels like a large improvement, definitely a larger improvement than going from GPT-2-345M to GPT-2-1.5b, or GPT-2-1.5b to GPT-3-12b, but how much?

Screening gains: 1:100 → 1:5, or 20× better? For fiction, I treat it as a curation problem: how many samples do I have to read to get one worth showing off? One could think of it as asking how efficiently a model searches The Library of Babel (or should that be "The Aleph"?): at the one extreme, an algorithm which selects letters at random will have to generate astronomically large numbers of samples before, like the proverbial monkeys, they generate a page from a Shakespeare play; at the other extreme, a reasonably intelligent human can dash off 1 plausible page in 1 try. With AI algorithms, the results are intermediate but rapidly improving. A text generator trained on a small corpus represents a huge leap over randomness: instead of having to generate quadrillions of samples, one might only have to generate millions of samples to get a coherent page; this can be improved to hundreds of thousands by increasing the depth of the n of its n-grams, which is feasible as one moves to Internet-scale text datasets (the classic example) or by careful hand-engineering & combination with other approaches like Mad-Libs-esque templating. A char-RNN, like in my char-RNN poetry experiments, does better still: it easily generates reasonable paragraphs, so one might only have to brute force on the order of thousands of samples to get a pleasing page. With GPT-2-117M poetry, I'd typically read through a few hundred samples to get a good one, with worthwhile improvements coming from 345M→774M→1.5b; by 1.5b, I'd say that I read through 50–100 'poems' to select one. But for GPT-3, once the prompt is dialed in, the ratio appears to have dropped to closer to 1:5—maybe even as low as 1:3! I frequently find myself shrugging at the first completion I generate, "not bad!" (Certainly, the quality of GPT-3's average prompted poem appears to exceed that of almost all teenage poets.) I would have to read GPT-2 outputs for months and probably surreptitiously edit samples together to get a dataset of samples like this page.
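For concreteness, the n-gram baseline mentioned above fits in a dozen lines; deepening n buys local coherence at the cost of needing exponentially more training text. A sketch (function names are mine):

```python
import random
from collections import defaultdict

def train_char_ngrams(text: str, n: int) -> dict:
    # Map each (n-1)-character context to the characters observed after it.
    model = defaultdict(list)
    for i in range(len(text) - n + 1):
        model[text[i:i + n - 1]].append(text[i + n - 1])
    return model

def sample(model: dict, n: int, seed: str, length: int, rng: random.Random) -> str:
    # Extend the seed one character at a time by sampling from the
    # characters seen after the current (n-1)-gram context.
    out = seed
    for _ in range(length):
        choices = model.get(out[-(n - 1):])
        if not choices:  # unseen context: dead end
            break
        out += rng.choice(choices)
    return out
```

Trained on a small corpus with n=3, this produces word-salad with plausible letter sequences; pushing n up to 7 or 10 on a large corpus yields mostly-real words and occasional phrases, which is roughly where the pre-RNN state of the art sat.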

Prompts As Programming

"On two occasions I have been asked,—'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' In one case a member of the Upper, and in the other a member of the Lower, House put this question. I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question."

Charles Babbage, Passages from the Life of a Philosopher, 1864

A new programming paradigm? The GPT-3 neural network is so large a model in terms of power and dataset that it exhibits qualitatively different behavior: you do not apply it to a fixed set of tasks which were in the training dataset, requiring retraining on additional data if one wants to handle a new task (as one would have to retrain GPT-2); instead, you interact with it, expressing any task in terms of natural language descriptions, requests, and examples, tweaking the prompt until it "understands" & it meta-learns the new task based on the high-level abstractions it learned from the pretraining. This is a rather different way of using a DL model, and it's better to think of it as a new kind of programming, where the prompt is now a "program" which programs GPT-3 to do new things. "Prompt programming" is less like regular programming than it is like coaching a superintelligent cat into learning a new trick: you can ask it, and it will do the trick perfectly sometimes, which makes it all the more frustrating when it rolls over to lick its butt instead—you know the problem is not that it can't but that it won't.

Reprogramming by asking politely. The demos above and on this page all6 use the raw default GPT-3 model, without any additional training. Instead, to get all these different behaviors, one provides a short textual input to GPT-3, with which it will predict the next piece of text (as opposed to starting with an empty input and freely generating anything); GPT-3, just by reading it, can then flexibly adapt its writing style and reasoning and use new definitions or rules or words defined in the textual input, no matter that it has never seen them before.

What is meta-learning? This is considered "meta-learning" because GPT-3 has "learned how to learn": in its endless training on so many gigabytes of text, it encounters so many different kinds of text that it had no choice but to learn abstractions & how to understand descriptions & instructions & formatting & authorial intent to let it adapt on the fly to the current piece of text it was training on, since there was too much diversity & data for it to simply learn each task normally by repeated exposure—much less memorize all the data. At scale, for a sufficiently powerful (large) NN, the simplest & easiest algorithms to learn for better prediction are abstractions & intelligence: the harder and bigger, the better. When GPT-3 meta-learns, the weights of the model do not change, but as the model computes layer by layer, the internal numbers become new abstractions which can carry out tasks it has never done before; in a sense, the GPT-3 model with the 175b parameters is not the real model—the real model is those ephemeral numbers which exist in between the input and the output, and define a new GPT-3 tailored to the current piece of text. The real GPT-3 is not the fixed hardwired weights, which merely are a bootstrap or a compiler for creating the real GPT-3, a new model customized to the data which exists only briefly in the soft attention weights during runtime, and may do completely different things from the baseline model.7

Few-shot learning/writing prompts: "Software 3.0"? (Andrej Karpathy, 2020-06-18)

Programming by dialogue? Because you aren't finetuning GPT-3 in the conventional way, interacting with GPT-3 via its few-shot learning power takes on an entirely different feeling than anything else I've used before. With regular software, you have to think through exactly how to do something; with deep learning software, you have to focus on providing data which in some way embodies the correct answer which you want; but with GPT-3, you instead think about how to describe what you want. With GPT-3, it helps to anthropomorphize it: sometimes you literally just have to ask for what you want. (It can't possibly be that easy, can it? Sometimes, it is!) Thus, you can simply ask it directly in the Q&A format: "what is X?" For example, if you want it to detect gibberish questions and avoid trying to answer them and show some understanding of its uncertainty, you can specify in the prompt that it shouldn't answer nonsense questions, and you can ask it to double-check an earlier answer; if you find it doesn't seem to understand that a horse has two eyes or that a toaster weighs more than a pencil, perhaps asking more questions with better settings will fix that. Other times, you must instead think, "If a human had already written out what I wanted, what would the first few sentences sound like? What would the introduction and summary sound like? What if I told a story here, how would that story start?" Thus, the summarization prompt: "My second grader asked me what this passage means: …" Some tasks in the GPT-3 paper which showed disappointing performance can be improved dramatically by finding appropriate formatting or prompts: arithmetic improves enormously with comma formatting of decimals (due to BPEs), and the "Word in Context" benchmark, where GPT-3 surprisingly showed below-chance performance compared to the 85% SOTA, can be improved to >70% with better prompting.
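The gibberish-question advice above amounts to nothing more than careful string construction. A minimal sketch (the wording is illustrative, not an exact prompt from my experiments):

```python
def make_qa_prompt(question: str) -> str:
    # Few-shot Q&A prompt: declare the refusal behavior up front, show one
    # normal answer and one refusal, then leave "A:" dangling for the model
    # to complete.
    header = (
        'I answer questions truthfully. If a question is nonsense or has no '
        'sensible answer, I reply "I have no idea."\n\n'
        "Q: How many eyes does a horse have?\nA: Two.\n\n"
        "Q: How many rainbows does it take to jump from Hawaii to seventeen?\n"
        'A: I have no idea.\n\n'
    )
    return header + "Q: " + question + "\nA:"
```

Everything before the final "Q:" is fixed; only the user's question varies, and the dangling "A:" constrains the completion to the answer slot rather than letting GPT-3 wander off into some other genre.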

Sampling Can Prove The Presence Of Knowledge But Not The Absence

GPT-3 may "fail" if a prompt is poorly-written, does not include enough examples, or bad sampling settings are used. I have demonstrated this many times when someone shows a "failure" of GPT-3—the failure was their own. The question is not whether a given prompt works, but whether any prompt works.

Any child psychologist trained in administering IQ tests is well-aware of the need to build rapport with children, to monitor them for problems and gauge their linguistic skills: are they not a native English speaker? Are they angry with or afraid of the psychologist? Are they apathetic and unmotivated? It is hard to ace an IQ test by accident, but it's trivial to fail one on purpose; trying to administer an IQ test to a child who has taken a disliking to you is a waste of the time of everyone involved, and presenting the resulting score as meaningful is professional malpractice.

The Lizardman Constant: nonsense prompt completions by humans.

Another cautionary example comes from survey research. Researchers have demonstrated repeatedly in human surveys that a certain small percentage of human responses will reliably be bullshit: "jokester" or "mischievous responders", or, more memorably, "lizardman" responders—respondents who give the wrong answer to simple questions. These people are some unanalyzable mix of lazy, stupid8, ignorant, trolling, 'joking', or just plain contrarian, possibly causing systematic biases. (They show up falsely reporting being amputees, adoptees, LGBT, binge-drinkers, etc.)

Human failures of logic & common sense, examples. One interesting example: 75–96% of the almost-million-strong registrants of the American Independent Party thought they had registered as just "independents". Science-literacy surveys are well-known to demonstrate severe problems with both knowledge and responses; similarly, in arithmetic, performance is dismal, with <20% of Americans able to identify the even numbers in a list of 6, and ~10% of Americans able to identify which of 6 numbers are prime (equivalent to random guessing—forget about multiplying 4-digit numbers!). Pew finds 5% of atheists are "absolutely" or "fairly certain" that they believe in God (I'll charitably pass over meat-eating rates in vegans/vegetarians as a case of "the spirit is willing but the flesh is sweet"); in one 2016 election poll, 14% of undecided voters said Hillary Clinton might be a demon, but they might vote for her; 2% of Clinton supporters said she was & they would; Scott Alexander mentions the eponymous 4% of responders who say lizardmen rule the earth, but, to continue the infernal theme, notes also that 13% say Barack Obama is the Antichrist (5% of whom voted Obama); and—the monsters—5% approve of using cell phones in movie theaters; and perhaps trafficking with the ruinous powers explains how some respondents report impossible things and yet are still answering surveys (not that they actually use bleach). Such examples could surely be multiplied (literally ad nauseam, even?). Certainly, I (and Scott Alexander) see many bizarre responses any time we are unlucky enough to run a survey which contains a free response field!

Sampling proves presence but not absence in humans too… If challenged on their absurd responses, they will dig their heels in and double down. Unfortunately, there is no foolproof remedy against lizardman responses (one can use 'attention checks' and tests for overclaiming, like letting them endorse lists of items with fakes thrown in to catch the bullshitters), because humans gonna human. The reality is that humans don't answer questions reliably, accurately, or honestly even close to 100% of the time, and shamelessly fail 'common sense' or 'logic' or 'arithmetic' questions all the time, requiring extensive precautions, careful survey design, and just throwing out a lot of data as garbage.

Humans need prompt programming too. Should we conclude from such cases that humans, or at least some specific humans, are not actually intelligent? No, of course not. We would say that such people have simply not been properly instructed or educated, given incentive to be honest, or made normal unavoidable errors. It would be tendentious in the extreme to conclude that because some people will claim to have suffered fatal heart attacks that they are merely statistical pattern-matching machines emitting plausible yet semantically-null utterances while passing for human; if we want to conclude that, I hope we would probe them a little more thoughtfully than prompting them with some survey items and declaring the case closed!

Demand more from critics. We should expect nothing less of people testing GPT-3: when they claim to get a low score (much less stronger claims like "all language models, present and future, are unable to do X"), did they consider problems with their prompt? Whether all of the hyperparameters make sense for that task? Did they examine where completions go wrong, to get an idea of why GPT-3 is making errors? Did they test out a variety of strategies? Did they consider qualitatively how the failed completions sound? (Or did they copy-paste arbitrary hyperparameters, use the first prompt that came to mind, look at the output, and lazily present it to the world as proof of what GPT-3 can't do?)

Machine sympathy. Prompt programming often should be human-like: if a human wouldn't understand what was intended, why would GPT-3? It's not telepathic, and there are myriads of genres of human text which the few words of the prompt could belong to. (A helpful thought experiment: if someone emailed you a prompt out of the blue, with no other context whatsoever, what would you interpret it as? A joke, a troll, spam, or what?) Prompts should obey Gricean maxims of communication—statements should be true, informative, and relevant. One should not throw in irrelevant details or non sequiturs, because in human text, that implies that those details are relevant, no matter how nonsensical a narrative involving them may be.9 When a given prompt isn't working and GPT-3 keeps pivoting into other modes of completion, that may mean that one hasn't constrained it enough by imitating a correct output, and one needs to go further; writing the first few words or sentence of the target output may be necessary. (This was a particular problem with the literary parodies: GPT-3 would keep starting with the prompt, but then switch into, say, one-liner reviews of famous novels, or would start writing fanfictions, complete with self-indulgent prefaces. The solution was to write out the first 2 or 3 sentences of an example parody, and then GPT-3 would finish out the parody, look back and see that there was an example of a literary parody, and then happily start generating dozens of works+parody pairs, once it fell into the groove.) The more natural the prompt, like a 'title' or 'introduction', the better; unnatural-text tricks that were useful for GPT-2, like dumping in a bunch of keywords to try to steer it towards a topic, appear less effective or harmful with GPT-3.

Surprisingly powerful. Prompts are perpetually surprising—I kept underestimating what GPT-3 would do with a given prompt, and as a result, I underused it. Text is a weird way to try to input all these queries and output their results or examine what GPT-3 thinks (compared to a more natural NLP approach like using BERT's embeddings), and fiddly. Just as few people would have thought that you could get GPT-2 to automatically summarize text by simply appending a "TL;DR:" string, few people would guess GPT-3 could write emoji summaries or that if you use a prompt like "Summarize the plot of J.K. Rowling's Harry Potter in the style of Ernest Hemingway", you might get out a dozen profanity-laced reviews panning 20th-century literature (or a summary—in Chinese—of the Chinese translation10), or that if you use a prompt like "Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence", GPT-3 will generate poems but then immediately generate explanations of how neural networks work & discussions from eminent researchers like Gary Marcus of why they will never be able to truly learn or exhibit creativity like generating poems. It is difficult to try out variations on prompts because as soon as the prompt works, it's tempting to keep trying out completions to marvel at the sheer variety and quality as you are seduced into further exploring possibility-space. (GPT-3 never grows impatient or bored.) What other capabilities are latent, waiting to be exposed by someone stumbling across the right prompt?

(Of course, not all these capabilities are necessarily desirable: where there is programming, you can be sure there is hacking. Where there is “prompt programming”, there must be “prompt hacking”… GPT-3 can follow instructions, so within its context-window or with any external memory, it is surely Turing-complete, and who knows what exploits or attacks might be possible? Consider the AI Dungeon users as an early example of “prompt hacking”.)

Finetuning

Fine­tun­ing was nec­es­sary to ‘pro­gram’ GPT-2. GPT-3’s “prompt pro­gram­ming” par­a­digm is strik­ingly differ­ent from GPT-2, where its prompts were brit­tle and you could only tap into what you were sure were ex­tremely com­mon kinds of writ­ing, and, as like as not, it would quickly change its mind and go off writ­ing some­thing else. At best, you could fairly gener­i­cally hint at a topic to try to at least get it to use key­words; then you would have to fil­ter through quite a few sam­ples to get one that re­ally wowed you. (This was a trick I used for TWDNE to get it to gen­er­ate at least vaguely ani­me-re­lated plot sum­maries.) To get out­put re­li­ably out of GPT-2, you had to fine­tune it on a prefer­ably de­cen­t-sized cor­pus.

Do we need fine­tun­ing given GPT-3’s prompt­ing? But with GPT-3, you can just say so, and odds are good that it can do what you ask, and al­ready knows what you’d fine­tune it on. (For ex­am­ple, I thought I would have to fine­tune GPT-3 to get sam­ples of my­self, since GPT-2 does­n’t know any­thing about “Gw­ern”/“gw­ern.net”; but it turns out, all I have to do is put in “A new es­say by Gw­ern Bran­wen (g­w­ern.net):” and out comes an un­canny sim­u­lacrum of my­self, or Scott Alexan­der, or Paul Gra­ham, or…) Would it be bet­ter if fine­tuned? In­du­bitably. But it’s not nec­es­sary. And given the cre­ativ­ity of the non-fine­tuned GPT-3, I’m not sure that I even want to—and for­feit all the be­hav­iors I haven’t yet dis­cov­ered‽

As of mid-June 2020, the Ope­nAI API does not sup­port fine­tun­ing al­though OA was work­ing on it. But after enough time play­ing with GPT-3, I have be­gun to won­der: at this level of meta-learn­ing & gen­eral knowl­edge, do we need fine­tun­ing at all?

For GPT-2, I saw fine­tun­ing as do­ing 2 things:

  1. Fix­ing ig­no­rance: miss­ing do­main knowl­edge

    GPT-2 didn’t know many things about most things—it was just a handful (1.5 billion) of parameters trained briefly on the tiniest fraction of the Common Crawl subset of the Internet, without any books even.11 It’s not surprising that for many domains, it wouldn’t know the details; and even if the dataset included adequate text, it did not train on that data many times, and the knowledge competed with all the other domains it needed to know about, interfering.

    But GPT-3 already knows everything! GPT-3 is so much larger on every dimension that this seems like much less of a problem for any domain which is already well-represented in public HTML pages. GPT-2 might need to be trained on a fanfiction corpus to learn about some obscure character in a random media franchise & generate good fiction, but GPT-3 already knows about them and can use them appropriately in writing new fiction.

  2. Prompt­ing a spe­cific task:

    Even when GPT-2 knew a domain adequately, it had the frustrating behavior of rapidly switching domains. You might prompt it with a poem genre it knows adequately already, but then after a few lines, it would generate an end-of-text BPE and switch to generating a news article on Donald Trump. (Trump shows up a lot.) Presumably, while poetry was reasonably represented, it was still rare enough that GPT-2 considered poetry highly unlikely to be the next kind of text, and kept trying to jump to some more common & likely kind of text; GPT-2 was not smart enough to infer & respect the intent of the prompt.

    GPT-3 ex­hibits much less of this ‘mode switch­ing’ sort of be­hav­ior. Per­haps be­cause it is trained on a much larger and more com­pre­hen­sive dataset (so news ar­ti­cles aren’t so dom­i­nan­t), but also I sus­pect the meta-learn­ing makes it much bet­ter at stay­ing on track and in­fer­ring the in­tent of the promp­t—hence things like the “Trans­former po­etry” prompt, where de­spite be­ing what must be highly un­usual text, even when switch­ing to prose, it is able to im­pro­vise ap­pro­pri­ate fol­lowup com­men­tary.

    Nevertheless, sometimes we can’t or don’t want to rely on prompt programming. Finetuning for a specific task may be necessary when the task has evaded our prompt programming skills, or when we have data but not prompt-programmer time. For example, in the GPT-3 paper, many tasks underperform what GPT-3 can do if we take the time to tailor the prompts & sampling hyperparameters, and just throwing the naive prompt formatting at GPT-3 is misleading. However, researchers do not have the time to go through scores of benchmark tasks and fix them one by one; simply finetuning on them collectively ought to do at least as well as the correct prompts would, and requires much less human effort (albeit more infrastructure).

So, what would be the point of fine­tun­ing GPT-3 on po­etry or lit­er­a­ture? It has likely al­ready seen the fine­tun­ing cor­pus, knows most of it, and will tractably gen­er­ate po­ems on de­mand. There may be gains, but I won­der if they would be nearly as large as they were for GPT-2?

Playground

All of the fol­low­ing sam­ples were gen­er­ated us­ing the Ope­nAI Beta Play­ground, which looks like this:

OA API Beta Play­ground UI & avail­able prewrit­ten prompts/sampling op­tions

The Playground has some rough edges in Beta, and capacity issues. A good way to start is to generate samples with the log probs/logits turned on, paying attention to how sampling hyperparameters affect output, to gain intuition for how GPT-3 thinks & what samples look like when sampling goes haywire.

The quality vs diversity tradeoff for top-k/nucleus sampling on GPT-2 news articles: more extreme settings like top-k = 10 / top_p = 0.6 are equally good at getting the highest human ratings—but both come at the expense of variety of possible completions.

Tradeoff: diversity vs accuracy. It offers the standard sampling options familiar from earlier GPT-2 interfaces. One particularly manipulates the temperature setting to bias towards wilder or more predictable completions; for fiction, where creativity is paramount, it is best set high, perhaps as high as 1, but if one is trying to extract things which can be right or wrong, like question-answering, it’s better to set it low to ensure it prefers the most likely completion. (After all, the point of a high temperature is to regularly select completions which the model thinks aren’t likely; why would you do that if you are trying to get out a correct arithmetic or trivia answer?) For top_p, one can set it to ~0.95 and largely forget about it unless one suspects that it’s breaking answers like top-k does and it needs to be much lower, like 0.5; it’s there to cut off the tail of gibberish completions and reduce repetition, so it doesn’t affect the creativity too much. I generally avoid the use of the repetition penalties because I feel repetition is critical to creative fiction, and I’d rather err on the side of too much than too little, but sometimes they are a useful intervention; GPT-3, sad to say, maintains some of the weaknesses of GPT-2 and other likelihood-trained autoregressive sequence models, such as the propensity to fall into degenerate repetition.
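The interaction of these two knobs can be sketched in a few lines. This is a toy implementation of temperature + nucleus (top_p) sampling over a logit vector, not the API’s actual code:

```python
import math
import random

def sample_token(logits, temperature=1.0, top_p=0.95):
    """Toy temperature + nucleus (top_p) sampling over a logit vector.

    Temperature rescales the logits before the softmax (higher = flatter
    distribution = wilder completions); top_p then keeps only the smallest
    set of tokens whose cumulative probability reaches top_p, cutting off
    the long tail of gibberish completions.
    """
    # Temperature: divide logits before softmax (numerically stable form).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [(i, e / total) for i, e in enumerate(exps)]
    # Nucleus: sort descending, keep tokens until cumulative prob >= top_p.
    probs.sort(key=lambda t: -t[1])
    kept, cum = [], 0.0
    for i, p in probs:
        kept.append((i, p))
        cum += p
        if cum >= top_p:
            break
    # Renormalize the truncated distribution and draw from it.
    z = sum(p for _, p in kept)
    r = random.random() * z
    for i, p in kept:
        r -= p
        if r <= 0:
            return i
    return kept[-1][0]
```

With a low top_p, only the near-certain tokens survive truncation, so sampling becomes almost deterministic regardless of temperature; with top_p near 1 and a high temperature, the full tail is in play.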

Ranking final results for quality gain. A little more unusually, it offers a “best of” (BO) option, a ranking trick (also called the “random-sampling shooting method”): generate n possible completions independently, and then pick the one with the best total likelihood, which avoids the degeneration that an explicit tree/beam search would unfortunately trigger, as documented most recently by the nucleus sampling paper & reported by many others about likelihood-trained text models in the past (eg char-RNN in 2015).12 I’m not sure how to best use BO: it seems to be highly helpful for things with one right answer (such as tricky Q&A or reasoning), but when it helps with ‘creative’ completions is less clear. I tried out BO heavily because I couldn’t quite figure out how it interacts with quality. On the smaller models, it seems to help boost quality up towards ‘davinci’ (GPT-3-175b) levels without causing too much trouble, but on davinci, it appears to exacerbate the usual sampling issues: particularly with poetry, it’s easy for a GPT to fall into repetition traps or loops, or spit out memorized poems, and BO makes that much more likely. For generating completions of famous poems, it’s quite hard to get GPT-3 to generate new versions unless you actively edit the poem to force a difference. (In the most extreme case, generating new variations on “Jabberwocky”, I have been unable to produce any new versions under any setting, even taking the step of aggressively editing in new lines about how the vorpal sword bounced off the Jabberwock and it won… It always spits out chunks of the original.13) So BO is a double-edged sword. The best way I found to use it is to sample without it (BO=1) at max temp, and then once it has several distinctly different lines, sampling with more (eg BO=5) seems to help rather than hurt.
This is a lit­tle sur­pris­ing to me be­cause for Meena, it made a large differ­ence to do even a lit­tle BO, and while it had di­min­ish­ing re­turns, I don’t think there was any point they tested where higher best-of-s made re­sponses ac­tu­ally much worse (as op­posed to merely n times more ex­pen­sive). Pos­si­bly BO is much more use­ful for nonfiction/information-processing tasks, where there’s one cor­rect an­swer and BO can help over­come er­rors in­tro­duced by sam­pling or my­opia.
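Mechanically, BO is simple. Here is a sketch of the ranking trick, with `generate` and `total_logprob` standing in for hypothetical API calls (the real API does this server-side, and these are not the OpenAI client’s actual names):

```python
def best_of(prompt, generate, total_logprob, n=5):
    """'Best of' (BO) ranking: draw n completions independently at the
    current sampling settings, then keep the one the model itself scores
    as most likely overall.

    Unlike beam search, each candidate was sampled freely; ranking only
    afterwards is what avoids beam search's degenerate repetition.
    `generate` and `total_logprob` are stand-ins for API calls.
    """
    completions = [generate(prompt) for _ in range(n)]
    # Rank by total log-likelihood of each completion given the prompt.
    return max(completions, key=lambda c: total_logprob(prompt, c))
```

A usage sketch with fake functions: if three draws score −50, −5, and −20 total log-likelihood, `best_of` returns the −5 one. This also shows why BO hurts poetry: a memorized poem or a repetition loop is exactly what the model scores as most likely.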

Effective Prompt Programming

“To con­strain the be­hav­ior of a pro­gram pre­cisely to a range may be very hard, just as a writer will need some skill to ex­press just a cer­tain de­gree of am­bi­gu­i­ty. A com­puter is like a vi­o­lin. You can imag­ine a novice try­ing first a phono­graph and then a vi­o­lin. The lat­ter, he says, sounds ter­ri­ble. That is the ar­gu­ment we have heard from our hu­man­ists and most of our com­puter sci­en­tists. Com­puter pro­grams are good, they say, for par­tic­u­lar pur­pos­es, but they aren’t flex­i­ble. Nei­ther is a vi­o­lin, or a type­writer, un­til you learn how to use it.”

Marvin Minsky, “Why Programming Is a Good Medium for Expressing Poorly-Understood and Sloppily-Formulated Ideas”, 1967

An­thro­po­mor­phize your prompts. There is no sub­sti­tute for test­ing out a num­ber of prompts to see what differ­ent com­ple­tions they elicit and to re­verse-engi­neer what kind of text GPT-3 “thinks” a prompt came from, which may not be what you in­tend and as­sume (after all, GPT-3 just sees the few words of the promp­t—it’s no more a telepath than you are). If you ask it a ques­tion to test its com­mon­sense rea­son­ing like “how many eyes does a horse have” and it starts com­plet­ing with a knock­-knock joke, you need to re­think your prompt! Does it spit out com­ple­tions that look like it’s think­ing but it’s ex­e­cut­ing the wrong al­go­rithm, or it falls back to copy­ing parts of the in­put? Then one may need to few-shot it by pro­vid­ing ex­am­ples to guide it to one of sev­eral pos­si­ble things to do. One should also keep in mind the im­por­tance of sam­pling pa­ra­me­ters, and whether one is look­ing for a sin­gle cor­rect an­swer (so low temp with BO=1 if com­pute-lim­it­ed, or high temp and BO=20 if pos­si­ble) or if one is try­ing for cre­ative an­swers (high temp with rep­e­ti­tion penalties).

The 4 Horsemen: short context, bad prompts, BPEs, random sampling. My rule of thumb when dealing with GPT-3 is that if it is messing up, the errors are usually attributable to one of 4 problems: too-short context windows, insufficient prompt engineering, BPE encoding making GPT-3 ‘blind’ to what it needs to see to understand & solve a problem, or noisy sampling sabotaging GPT-3’s attempts to show what it knows. Another useful heuristic is to try to express something as a multi-step reasoning process, such as a dialogue: because GPT-3 is a feedforward NN, it can only solve tasks which fit within one “step” or forward pass; any given problem may be too inherently serial for GPT-3 to have enough ‘thinking time’ to solve it, even if it can successfully solve each intermediate sub-problem within a step. So people have demonstrated that GPT-3 won’t solve a simple math problem in a single step, but it will solve it if you reframe it as a ‘dialogue’ with an anime character (who knew neural network research would lead to anime wolfgirl demonology?) and even ask it to guess-and-check or brute-force the answer; one can also experiment in coaching it through examples14, or asking it about previous answers or its uncertainty. This makes sense if we think of Transformers as unrolled RNNs which unfortunately lack a hidden state: serializing out the reasoning helps overcome that computational limitation.

Log­prob de­bug­ging. GPT-3 does not di­rectly emit text, but it in­stead pre­dicts the prob­a­bil­ity (or “like­li­hood”) of the 51k pos­si­ble BPEs given a text; in­stead of merely feed­ing them into some ran­dom­ized sam­pling process like tem­per­a­ture top-k/top_p sam­pling, one can also record the pre­dicted prob­a­bil­ity of each BPE con­di­tional on all the pre­vi­ous BPEs. This gives you a sim­ple idea of what GPT-3 is think­ing about each BPE: is it likely or un­likely (given the pre­vi­ous BPEs)? Which BPEs are es­pe­cially un­like­ly? Does it “get it” as the com­ple­tion goes on? I don’t use log­probs much but I gen­er­ally use them in 1 of 3 ways: I use them to see if the prompt ‘looks weird’ to GPT-3; to see where in a com­ple­tion it ‘goes off the rails’ (sug­gest­ing the need for lower temperatures/top_p or higher BO); and to peek at pos­si­ble com­ple­tions to see how un­cer­tain it is about the right an­swer—a good ex­am­ple of that is Ar­ram Sa­beti’s un­cer­tainty prompts in­ves­ti­ga­tion where the log­probs of each pos­si­ble com­ple­tion gives you an idea of how well the un­cer­tainty prompts are work­ing in get­ting GPT-3 to put weight on the right an­swer, or in my par­ity analy­sis where I ob­served that the log­probs of 0 vs 1 were al­most ex­actly 50:50 no mat­ter how many sam­ples I added, show­ing no trace what­so­ever of few-shot learn­ing hap­pen­ing. Thus, log­probs can offer more in­sight while de­bug­ging a prompt than just re­peat­edly hit­ting ‘com­plete’ and get­ting frus­trat­ed.
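If the API returns per-token logprobs alongside a completion, this kind of debugging reduces to scanning for the improbable tokens. A sketch (the `(token, logprob)` pair format here is an assumption for illustration, not the API’s exact schema):

```python
import math

def flag_surprises(token_logprobs, threshold=-4.0):
    """Scan a completion's (token, logprob) pairs and return the tokens
    the model itself found unlikely -- useful for spotting where a prompt
    'looks weird' to GPT-3, or where a completion 'goes off the rails'.

    threshold=-4.0 flags tokens assigned under ~2% probability.
    """
    return [(i, tok, math.exp(lp))   # position, token, probability
            for i, (tok, lp) in enumerate(token_logprobs)
            if lp < threshold]

# Only the token the model found surprising gets flagged:
flag_surprises([("How", -0.3), ("many", -0.1), ("eyes", -0.5), ("knock", -9.1)])
```

A run of flagged tokens mid-completion suggests lowering temperature/top_p or raising BO; flagged tokens inside the prompt itself suggest the prompt is the problem.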

AI Dun­geon ≠ GPT-3

While a neat trick & big upgrade over public GPT-2 models, AID is not an unrestricted full-power GPT-3: caveat emptor!

AI Dun­geon < GPT-3. For peo­ple us­ing the AI Dun­geon (AID) route, things are tricky be­cause AID users don’t have the same sam­pling op­tions that API users do (no best-of is par­tic­u­larly painful when try­ing to elicit cor­rect an­swers to hard ques­tion­s), and no con­trol over the full prompt/history, with AID do­ing lots of things be­hind the scenes on a model that may have been fine­tuned on RPG-like ma­te­r­ial & count­less AID game tran­scripts etc, and with qual­ity of model com­pletely out of their hands (does choos­ing “cus­tom” get you Drag­on, or do you have to choose a differ­ent mode & edit it? the nec­es­sary trick seems to change over time), with oc­ca­sional dras­tic qual­ity drops re­ported by many AID users when… some­thing changes on the back­end. For ex­am­ple, if you are an AID user, were you aware that the first re­sponse for a cus­tom prompt is ac­tu­ally al­ways GPT-2, to try to block back­door GPT-3 ac­cess? Or that “We cut off the gen­er­a­tion at cer­tain points (trail­ing sen­tences etc…) Dis­able cer­tain to­kens to im­prove per­for­mance or make gen­er­a­tion safer, fine-tune on text ad­ven­tures and only use the last ~1000 to­kens of con­text.” A cau­tion­ary ex­am­ple of AID use comes from Gary Mar­cus & Ernest Davis’s use: they fil­tered a large num­ber of ques­tions through AID to try to find cases GPT-3 would fail on; how­ev­er, when the AID fail­ure cases were tested on GPT-3 by Dou­glas Sum­mer­s-S­tay, it solved half of them! (AID is de­signed to pro­duce fun text ad­ven­tures, not be a NLP test­bed, and that shows when one tries to use AID as a back­door to GPT-3.) To work around this, AID users seem to need to warm up ses­sions care­fully with de­scrip­tive prompts/interactions to over­come the gam­i­fi­ca­tion, and avoid any­thing that might veer back into com­edy or dra­ma.

Only once these have been ruled out do I start con­sid­er­ing al­ter­na­tive ex­pla­na­tions like “lan­guage mod­els will never solve X”.

Weaknesses

Limited memory, repetition/divergence, BPE encoding. GPT-3 is, of course, not perfect. We should keep that in mind when evaluating it. As a scaled-up GPT-2, it has mostly the same weaknesses, and my thoughts on improvements remain mostly the same (aside from moving away from BPEs, the need for which is becoming increasingly urgent; see the next section).

Ar­ti­fi­cial in­tel­li­gence pro­grams like deep learn­ing neural net­works may be able to beat hu­mans at play­ing Go or chess, or do­ing arith­metic, or writ­ing Navy Seal copy­pas­ta, but they will never be able to truly think for them­selves, to have con­scious­ness, to feel any of the rich­ness and com­plex­ity of the world that we mere hu­mans can feel. Mere, un­en­light­ened hu­mans might be im­pressed by the abil­i­ties of sim­ple deep learn­ing pro­grams, but when looked at in a more holis­tic man­ner, it all adds up to… well, noth­ing. They still don’t ex­hibit any trace of con­scious­ness. All of the avail­able data sup­port the no­tion that hu­mans feel and ex­pe­ri­ence the world differ­ently than com­put­ers do. While a com­puter can beat a hu­man mas­ter at chess or Go or some other game of struc­tured rules, it will never be able to truly think out­side of those rules, it will never be able to come up with its own new strate­gies on the fly, it will never be able to feel, to re­act, the way a hu­man can. Ar­ti­fi­cial in­tel­li­gence pro­grams lack con­scious­ness and self­-aware­ness. They will never be able to have a sense of hu­mor. They will never be able to ap­pre­ci­ate art, or beau­ty, or love. They will never feel lone­ly. They will never have em­pa­thy for other peo­ple, for an­i­mals, for the en­vi­ron­ment. They will never en­joy mu­sic or fall in love, or cry at the drop of a hat. Merely by ex­ist­ing, mere, un­en­light­ened hu­mans are in­tel­lec­tu­ally su­pe­rior to com­put­ers, no mat­ter how good our com­put­ers get at win­ning games like Go or Jeop­ardy. We don’t live by the rules of those games. Our minds are much, much big­ger than that.

Wait, I’m sor­ry—that pre­ced­ing para­graph on the weak­nesses of deep learn­ing was ac­tu­ally writ­ten by GPT-3, and is in the wrong sec­tion. (Man­age­ment re­grets the mis­take.) But se­ri­ous­ly, what weak­nesses does GPT-3 have?

Small Context Window

No mem­ory (fix­able). The first limit is that it re­mains hob­bled by the lim­ited con­text win­dow. GPT-3 has no form of mem­ory or re­cur­rence, so it can­not see any­thing out­side its lim­ited 2048 BPEs (rough­ly, 500–1000 word­s). This means it can­not hope to write any­thing of any se­ri­ous length, be­cause the be­gin­ning will soon van­ish over the event hori­zon, and it also lim­its its abil­ity to en­gage in few-shot learn­ing, for the same rea­son: the promp­t+­gen­er­a­tion will quickly ex­ceed the win­dow length. While the dam­age may be lim­ited for tasks where the for­mat is repet­i­tive, like Q&A (so GPT-3 can do the nec­es­sary meta-learn­ing over its com­ple­tions just as well as over the orig­i­nal promp­t), this does limit it and is frus­trat­ing.
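The mechanical consequence is easy to state: with no recurrence or memory, everything is a sliding window. A minimal sketch, with a token list standing in for a BPE sequence:

```python
def fit_context(tokens, window=2048):
    """Keep only the most recent tokens that fit in the context window.

    A minimal sketch of GPT-3's hard constraint (the real window is
    counted in BPEs, not words): anything earlier than the last `window`
    tokens simply vanishes 'over the event horizon', which is also why
    few-shot examples get squeezed out as a completion grows.
    """
    return tokens[-window:]

# A 3000-token prompt+completion: the first 952 tokens are invisible.
history = list(range(3000))
visible = fit_context(history)
len(visible)  # 2048
```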

Repetition/Divergence Sampling

Repetition/gibberish (mystery). Autoregressive language models trained by likelihood (prediction) loss all share an extremely annoying problem: when you generate free-form completions, they have a tendency to eventually fall into repetitive loops of gibberish. Whether GPT-2 or T5 or etc, they all seem to do it, and if one tries to avoid such extremely dumb & crude sampling strategies like top-k temperature sampling by doing explicit search for likely text completions, such as beam search, these searches actually make the problem worse: the better your search is, the worse the results are. Tweaks like nucleus sampling can reduce it, but do not eliminate it. (No one has tried gradient ascent for generating optimal samples, as far as I know.) Since GPT-2-1.5b seemed almost as prone as GPT-2-117M, I was unsurprised to find that GPT-3 too falls easily into the repetition trap.

Why rep­e­ti­tion? This be­hav­ior re­mains puz­zling and I don’t think any­one re­ally knows how to fix it. Top-k or nu­cleus sam­pling can’t be right and are clearly ugly ad hoc hacks, but is the core prob­lem like­li­hood train­ing or sam­pling, or what? And why is it never a prob­lem for other kinds of se­quences like im­ages, and much less of one for mu­sic, or in tasks like neural trans­la­tion where tricks like beam search are al­ways used be­cause they do im­prove? (We don’t see it in char-RNNs or GPT-2s trained on ABC/MIDI mu­sic, or OA Juke­box trained on raw au­dio; we cer­tainly don’t see it in iGPT or PixelRNN etc.) Like­li­hood train­ing is com­pellingly sim­ple and effi­cient, and we know that real brains are con­stantly pre­dict­ing fu­ture in­puts; it seems im­plau­si­ble that the en­tire prob­lem will dis­ap­pear if we slap on some Bayesian tricks to get pos­te­rior es­ti­mates of the like­li­hood of each pos­si­ble BPE com­ple­tion (and I’m not aware of any­one show­ing that it does in some­thing like a small Bayesian RNN trained with HMC or by us­ing deep en­sem­bling or other Bayesian ap­prox­i­ma­tion­s). Fur­ther, if like­li­hood train­ing is so bad, why does min­i­miz­ing the pre­dic­tive loss work so con­sis­tently over a wide range to im­prove the qual­ity of gen­er­a­tions and how use­ful the model is for zero/few-shot learn­ing or semi­-su­per­vised tasks, and why does the loss cor­re­late near-per­fectly with hu­man rat­ings of qual­ity in the Meena pa­per?

Language Prediction = Imitation Learning? My intuition is that the repetition trap is essentially the DAgger/off-policy imitation learning problem in a non-RL guise: as the model is fed back in its own guesses as a ground truth, the hallucinated text becomes gradually more off-policy and divergent from real human-written text (which is backed by a knowledge base & a purpose), and the model is unable to come up with sensible continuations (having never trained on such gibberish) and does not ‘want’ to get back on track (having been trained purely to make one-step predictions). The solution might look something like detecting when a completion might go too far off-distribution and backtracking, or more RL-like training of generation as opposed to mere prediction. It would probably help also to use some sort of hierarchical or planning method: one might be able to convince GPT-3 to generate summaries and then expand each line of the summary recursively (one project does something similar, using a bag-of-words topic model with GPT-2 to “upscale” a seed).

BPEs

Com­pared to GPT-2, GPT-3 im­proves per­for­mance on char­ac­ter-level tasks like rhyming, al­lit­er­a­tion, pun­ning, ana­grams or per­mu­ta­tions, acros­tic po­ems, and arith­metic less than ex­pect­ed, de­spite be­ing very good at many other close­ly-re­lated kinds of writ­ings like satire.

Why? A plau­si­ble ex­pla­na­tion is an ob­scure tech­ni­cal de­tail: as a per­for­mance op­ti­miza­tion, GPT does not see char­ac­ters but sub­-word-chunks called “byte-pair en­cod­ings” (BPEs). Be­cause GPTs never see char­ac­ters but opaque par­tial-words, which vary chaot­i­cally based on the spe­cific word and even the sur­round­ing con­text, they are un­able to eas­ily learn about char­ac­ter-level as­pects of lan­guage, like sim­i­lar spellings or sounds, and are forced to learn re­la­tion­ships much more in­di­rect­ly, like by brute-force mem­o­riz­ing of pairs of words.

Some experiments with reformatting GPT-3’s poorest-performing tasks to avoid inconsistent BPE encodings of strings show small to large performance gains, consistent with this theory.

Bad at phonetic/character-level tasks. Disappointingly, the issues that have been noticed with GPT-2-poetry’s disinclination to rhyme remain. GPT-3 rhymes reasonably well and often when appropriate, but the improvement is much smaller on rhyming than it is on pretty much everything else. Apparently it is easier for GPT-3 to learn things like arithmetic and spreadsheets than it is to learn how to rhyme. A similar issue comes with puns. Better, but not as much better as one would expect given the leap on many other capabilities. Trying to generate puns or rhymes, it seems like GPT-3 knows extremely well what they are on an abstract level, and will appropriately manipulate words and attempt to make puns or rhymes (see the shoggoth-cat dialogue below for a particularly striking example), but the words it chooses just aren’t right on a phonetic basis. On the other hand, it’s not as if GPT-3 is unable to understand humor—it is a brilliant mimic with parodies, has a cutting wit for satire, and can generate one-liners easily like the “I have a joke” format (1, 2) or Drake memes, as long as they rely more on semantics than syntax.

BPEs ≠ char­ac­ters! My sus­pi­cion here is that the­se, and per­haps other is­sues, is due to the lossy BPE en­cod­ing. GPT mod­els do not see in­di­vid­ual char­ac­ters, but in­stead a larger chunk, called a byte-pair en­cod­ing (BPE); a byte-pair is a sim­ple com­pres­sion scheme where 50,257 word frag­ments or char­ac­ters are cho­sen to try to min­i­mize the en­cod­ing length on some ar­bi­trary text cor­pus, so a par­tic­u­larly com­mon word may get a unique BPE while a longer word will be en­coded as 2 or 3 BPEs, and a com­pletely novel word will be en­coded let­ter BPE by let­ter BPE as a fall­back. Hence, even if 2 words sound and are spelled sim­i­lar­ly, they may be given to­tally differ­ent BPE en­cod­ings which don’t have a sin­gle BPE in com­mon. In­deed, be­cause of the con­text de­pen­dence, BPEs are not even de­ter­min­is­tic from the user’s per­spec­tive: when a com­ple­tion of n to­kens length is re­quest­ed, you may get differ­ent re­sults be­cause of differ­ent BPE en­cod­ings—based on whether a given piece of text was in­put word by word and con­di­tioned on as part of the user/Playground’s prompt, or was gen­er­ated by GPT-3 as part of a com­ple­tion! Nos­tal­ge­braist dis­cussed the ex­treme weird­ness of BPEs and how they change chaot­i­cally based on white­space, cap­i­tal­iza­tion, and con­text for GPT-2, with a fol­lowup post for GPT-3 on the even weirder en­cod­ing of num­bers sans com­mas. (An­other ex­am­ple Nos­tal­ge­braist does­n’t touch on but Shawn Presser dis­cov­ered: the sym­bol de­not­ing end-of-text, <|endoftext|>, is, in the train­ing dataset, as­signed to the very last BPE ID; how­ev­er, if you con­vert the ASCII string “<|end­oftex­t|>” into BPEs, such as while prepar­ing your own fine­tun­ing dataset, it typ­i­cally con­verts into differ­ent BPEs! I’m not sure how GPT-2 man­ages to work with the end-of-text BPEs any­way. BPEs are weird.) 
I read Nostalgebraist’s posts at the time, but I didn’t know if that was really an issue for GPT-2, because problems like lack of rhyming might just be GPT-2 being stupid, as it was rather stupid in many ways; I kept it in mind while evaluating GPT-3, however.

Efficient… but limiting. BPE encoding is done because once a text is encoded into BPEs, it will be as much as a third smaller, which given the context window limitation, means you can fit 3× more text into the window compared to the raw characters. This is indeed quite a gain, but it is a double-edged sword: it is confusing to write code for it because the BPE encoding of a text is unfamiliar & unpredictable (adding a letter can change the final BPEs completely), and the consequences of obscuring the actual characters from GPT are unclear. I think that BPEs bias the model and may make rhyming & puns extremely difficult because they obscure the phonetics of words; GPT-3 can still do it, but it is forced to rely on brute force, by noticing that a particular grab-bag of BPEs (all of the different BPEs which might encode a particular sound in its various words) correlates with another grab-bag of BPEs, and it must do so for every pairwise possibility. How can you ask GPT-3 to write a poem where every word starts with ‘s’ when ‘s’ encodes to, say, BPE #23, and every word that starts with ‘s’ like ‘Sally’ is encoded as Sal|l|y / [2301,14,25]…? It’d be unsurprising if GPTs struggled to understand & manipulate things on the character level given that the entire point of BPE is to compress away characters as much as possible. (There are similar issues in neural machine translation: analytic languages, which use a relatively small number of unique words, aren’t too badly harmed by forcing text to be encoded into a fixed number of words, because the order matters more than what letters each word is made of; the lack of letters can be made up for by memorization & brute force.
However, a synthetic language like Finnish or German—with their famously long words like kumarreksituteskenteleentuvaisehkollaismaisekkuudellisenneskenteluttelemattomammuuksissansakaankopahan or Rindfleischetikettierungsüberwachungsaufgabenübertragungsgesetz/‘law to transfer duties of monitoring labelling of beef’ formed by constantly adding additional letters/words—has countless unique or extremely rare words no matter how large your corpus, all of whose internal structure of letters & sub-words is hidden by a word embedding, which destroys the ability to understand them.)
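The chaos is easy to reproduce with a toy segmenter. The vocabulary below is invented for illustration (GPT-3’s real table has 50,257 entries chosen by byte-pair merges, with context-dependent quirks this sketch ignores), but the effect is the same: words that rhyme perfectly can end up with no token in common:

```python
def greedy_bpe(word, vocab):
    """Toy greedy subword segmenter: scan left-to-right, always taking
    the longest piece found in the vocabulary, falling back to single
    characters (like BPE's letter-by-letter fallback for novel words).
    The vocabulary is invented for illustration, not GPT-3's actual one.
    """
    pieces = []
    i = 0
    while i < len(word):
        for j in range(len(word), i, -1):
            if word[i:j] in vocab or j == i + 1:
                pieces.append(word[i:j])
                i = j
                break
    return pieces

vocab = {"light", "ly", "sl", "ight", "night", "kn"}
# 'light', 'slight', 'night', 'knight' all rhyme, yet their segmentations
# can share no piece at all -- the rhyme is invisible at the token level:
greedy_bpe("light", vocab)   # ['light']
greedy_bpe("slight", vocab)  # ['sl', 'ight']
greedy_bpe("knight", vocab)  # ['kn', 'ight']
```

So a model that only ever sees `['light']` and `['sl', 'ight']` has no direct evidence the two words end identically; it must memorize the correlation pair by pair.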

Reformatting to beat BPEs. I have further observed that GPT-3’s anagram capabilities appear to improve considerably if you separate each letter in an anagram with a space (guaranteeing that the letter will have the same BPE in both the scrambled & unscrambled versions). And Matt Brockman has observed, testing thousands of examples over several orders of magnitude, that GPT-3’s arithmetic ability—surprisingly poor when we know far smaller Transformers work well in math domains (eg Thopliterce)—appears to dramatically improve several-fold if you merely format numbers with commas instead of being purely numeric (with an additional boost from using dollar signs); I confirmed this with my Turing dialogue example where GPT-3 fails badly on the arithmetic sans commas & low temperature, but often gets it exactly correct with commas.15 (Why? More written text may use commas when writing out implicit or explicit arithmetic, yes, but use of commas may also drastically reduce the number of unique BPEs, as only 1–3 digit numbers will appear, with consistent BPE encoding, instead of having encodings which vary unpredictably over a much larger range.) I also note that GPT-3 improves on anagrams if given space-separated letters, despite the fact that this encoding is 3× larger. Likewise, acrostic poems just don’t work if we input them normally, but they do if we carefully expose the relevant individual letters. This explains naturally why rhyming/puns improve gradually with parameter/data size and why GPT-3 can so accurately define & discuss them, but there is never any ‘breakthrough’ like with its other capabilities. We assume character-level understanding so implicitly that we fail to even consider what things look like to GPT-3 after BPE encoding.
(I have not been able to test whether GPT-3 will rhyme fluently given a proper encoding; I have tried out a number of formatting strategies, using the International Phonetic Alphabet (IPA) to encode rhyme-pairs at the beginning or end of lines, annotated within lines, space-separated, and non-IPA-encoded, but while GPT-3 knows the IPA for more English words than I would’ve expected, none of the encodings show a breakthrough in performance like with arithmetic/anagrams/acrostics. It’s worth noting that one rhyme-specific sonnet-only model had to be trained directly on character-level representations of end-rhyme pairs.)

BPE sabotage is common. Thus far, the BPE encoding appears to sabotage performance on rhyming, alliteration, punning, anagrams or permutations, acrostics, arithmetic, and Melanie Mitchell’s Copycat-style letter analogies (GPT-3 fails without spaces on “abc : abcd :: ijk : ijl” but succeeds when space-separated, although it doesn’t solve all letter analogies and may or may not improve with priming using Mitchell’s own article as the prompt; compare with a 5-year-old child). OA’s GPT-f work on using GPT for MetaMath formal theorem-proving notes that they use the standard GPT-2 BPE but “preliminary experimental results demonstrate possible gains with specialized tokenization techniques.” I wonder what other subtle GPT artifacts BPEs may be causing?16 For example, consider puns: BPEs mean that GPT-3 can’t learn puns because it doesn’t see the phonetic or spelling similarities that drive verbal humor; but the training data will still be filled with verbal humor—so what does GPT-3 learn from all that? Perhaps it learns that “humor” is a kind of writing where the convention is to tell a superficially sensible story which then ends in an (apparently) arbitrary randomly-chosen word… Another question is foreign languages like Russian; one user noticed that when they triggered Russian, completions seemed to work one letter at a time, which hints that it sees Russian encoded as individual characters, but attempts to trigger rhyming or puns just yielded Russian gibberish, perhaps showing the flip side of the BPE problem: with a fixed small context window, not using BPEs, particularly on low-n data (Russian is ~0.18% of the GPT-3 training dataset), may itself hamper performance badly.17 (One has to assume that a synthetic & low-resource language like Turkish will be just gibberish. Transfer learning from English only goes so far.)
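The space-separation fix for the letter analogies is purely a prompt-formatting change; a minimal sketch (the prompt layout is illustrative):

```python
def space_out(text):
    """Separate every character with a space, so a BPE tokenizer is
    forced to give each letter its own consistent token instead of
    merging letters into unpredictable multi-character chunks."""
    return " ".join(text)

# A Copycat-style letter analogy, reformatted character-by-character:
prompt = f"{space_out('abc')} : {space_out('abcd')} :: {space_out('ijk')} :"
print(prompt)  # 'a b c : a b c d :: i j k :'
```

The model then only has to complete `i j l` letter-by-letter, rather than reason about whatever opaque multi-letter BPE chunks `ijk` happened to tokenize into.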

Fixing BPEs. BPEs were useful for smaller models that needed as much context window as possible and which wouldn’t benefit much from access to the raw characters (or would be harmed because they’d underfit), but in another example of the bitter lesson, it appears it is time to discard them as we are able to pay more compute for better results. This is fixable by the same methods as fixing the context window: once the context-window limit is broken and one has effective contexts of, say, l=60k, then one can afford to spend 40k of it moving to character-based inputs. Another idea, if character-level models are still infeasible, is to try to manually encode the knowledge of phonetics, at least, somehow; one way might be to data-augment inputs by using linguistics libraries to convert random texts to IPA (which GPT-3 already understands to some extent). By seeing a phonetic-encoded version of random texts, it should learn what words sound similar even if they have radically different BPE representations. A third idea is BPE dropout: randomize the BPE encoding, sometimes dropping down to character-level & alternative sub-word BPE encodings, averaging over all possible encodings to force the model to learn that they are all equivalent, without losing too much context window while training any given sequence. And there may be encodings which simply work better than BPEs.
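The phonetic data-augmentation idea can be sketched as follows. The tiny dictionary here is a hypothetical stand-in for a real grapheme-to-phoneme tool (one built on CMUdict or espeak, say), and the IPA strings are illustrative:

```python
# Hypothetical mini grapheme-to-phoneme table; a real pipeline would
# use a linguistics library covering the whole vocabulary.
IPA = {"the": "ðə", "cat": "kæt", "sat": "sæt", "on": "ɒn", "mat": "mæt"}

def to_ipa(text):
    """Render a text phonetically, falling back to the original
    spelling for out-of-vocabulary words."""
    return " ".join(IPA.get(word, word) for word in text.lower().split())

def augment(text):
    """Pair each training text with a phonetic version, so the model
    can learn which words sound alike despite differing BPEs."""
    return [text, to_ipa(text)]

print(augment("the cat sat on the mat"))
```

Trained on such paired texts, the model could in principle associate ‘knead’ and ‘need’ through their near-identical IPA forms even though their BPE encodings share nothing.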

Format

In the samples below, bold denotes all human-written input; everything not in bold is computer-written.20 For multiple completions of the same prompt, I omit the prompt with a bold ellipsis: “…” In my other GPT samples, I have generally used codeblock formatting, but GPT-3 samples are often long lines (and more worth reading), so here I have tried to edit the samples as little as possible while still keeping them readable in blockquotes.

As far as the sampling goes: I used the largest “davinci” GPT-3-175b model unless otherwise specified. (Davinci is the highest quality and not too slow: ~147 WPM.) Since I only speak English well, I avoid testing any foreign-language material. These are not all samples I generated on the first try: I was regularly editing the prompts & sampling settings as I explored prompts & possible completions. The sampling settings were generally roughly as I advise above: high temperature, slight p truncation & repetition/presence penalty, and occasional use of high BO where it seemed potentially helpful (specifically, anything Q&A-like, or where it seemed like GPT-3 was settling for local optima while greedily sampling but longer high-temperature completions jumped out as better).
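Written out in the shape of the 2020 OpenAI Completions API parameters, those settings look roughly like this; the exact values are illustrative of the advice, not a recipe:

```python
# Sampling settings in the shape of the 2020 OpenAI Completions API;
# values here are illustrative, matching the rough advice above.
sampling_settings = {
    "engine": "davinci",       # GPT-3-175b, the highest-quality model
    "temperature": 1.0,        # high temperature
    "top_p": 0.95,             # slight p (nucleus) truncation
    "presence_penalty": 0.5,   # mild repetition/presence penalties
    "frequency_penalty": 0.5,
    "best_of": 1,              # raise toward ~20 for Q&A-like prompts
    "max_tokens": 256,
}
print(sorted(sampling_settings))
```

High `best_of` re-ranks many completions by likelihood, which suits “one right answer” prompts but tends to flatten creative writing, hence keeping it low by default.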

I am not claiming that these samples are strictly scientific or best-of-5 or anything. (I would guess that the selections below are roughly best-of-2 on average.) However, the samples are only minimally edited during generation. I attempted to exercise curation rather than editing: I did not edit the computer text; I kept it as-is, or I deleted the entire section to re-sample. But if readers still think I wrote the best parts of this page, then I will shamelessly steal the credit.

Dialogue

Shoggoth-Cat

It’s all text. What does the desired task look like? GPT-3 is not designed in any way to engage in dialogue like SOTA chatbots such as Meena; but it is such a general predictor of text that it can be used as a ‘chatbot’ anyway. GPT-3 can be triggered into a chatbot mode simply by labeling roles; one can have an “AI” and “human” chat with each other (GPT-3 does that well), or one can take on one of the roles by editing the text appropriately after each “AI” completion (remember, prompt-programming is purely textual, and can be anything you want). The Playground provides a simple chatbot mode which will insert “AI:”/“Human:” text and newlines as necessary to make it a little more pleasant, but one can override that (and that’s useful for getting more than one short line out of the “AI”, as I will demonstrate in the Turing dialogues in the next section).
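Since prompt-programming is purely textual, the whole “chatbot mode” is just string assembly; a minimal sketch of what the Playground does automatically (the speaker labels and layout are conventions, freely overridable):

```python
def build_chat_prompt(preamble, turns, next_speaker="AI"):
    """Render (speaker, text) turns as the plain-text transcript GPT-3
    actually sees; the trailing 'AI:' cues the model to complete the
    assistant's next line."""
    lines = [preamble, ""]
    lines += [f"{speaker}: {text}" for speaker, text in turns]
    lines.append(f"{next_speaker}:")
    return "\n".join(lines)

prompt = build_chat_prompt(
    "The following is a conversation with an AI assistant.",
    [("Human", "Hello, who are you?"),
     ("AI", "I am an AI created by OpenAI. How can I help you today?"),
     ("Human", "Tell me a funny story about cats.")])
print(prompt)
```

Taking over a role is just as textual: append or edit any line before re-submitting the transcript for the next completion.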

Below is a dialogue I did to probe GPT-3’s understanding of humor and puns:

AI: I am AI cre­ated by Ope­nAI. How can I help you to­day?

Hu­man: I am feel­ing bored. Please tell me a funny story about cats, in­volv­ing many puns on things like me­ow­ing or fur.

AI: Well, there was this cat named Bob…

Hu­man: What did Bob do?

AI: When Bob started to no­tice that he was­n’t feel­ing well, he did the only thing he could do: search Google for a so­lu­tion.

Hu­man: Cats can search Google?

AI: Yes. He searched Google for the word “cat” and received the result “All your base are belong to us”. That was a strange response, so he decided to try again. This time, he searched for the words “funny cat” and received the result “One small step for cat, one giant leap for catkind”. But still, he wasn’t satisfied. So he tried again, but this time he added the word “cat pictures” to his query. He received the result “That was no cat, that was a shoggoth.”

Hu­man: But shog­goths are hor­ri­fy­ing. What made that cat pic­ture hu­mor­ous?

AI: The cat was cov­ered in fur, and re­sem­bled a shog­goth. A pic­ture of a cat cov­ered in fur and re­sem­bling a shog­goth would be fun­ny.

Hu­man: Er. I sup­pose. Most cats are cov­ered in fur any­way… But go on.

AI: He tried again. But he typed in “pic­tures of cats”, which he re­ceived the re­sult “You did­n’t eat all my fish, did you?” This was ac­tu­ally a good pun, so he thought maybe he was get­ting clos­er. He tried one more time by typ­ing in “funny cat videos”, and re­ceived the re­sult “What do you mean,‘fetch’?”. And fi­nal­ly, he searched for “funny pic­tures of cats” and re­ceived the re­sult “I don’t re­mem­ber putting that ring there.”. Now sat­is­fied, he was able to go to sleep.

Hu­man: Those memes sound fun­ny. But you did­n’t in­clude any puns. So tell me, what is your fa­vorite cat pun?

AI: Well, the best pun for me was the one he searched for the third time: “You did­n’t eat all my fish, did you?” You see, the word “fish” can be re­placed with the word “cats” to make the sen­tence read “Did you eat all my cats?”

Hu­man: But the word “fish” does­n’t sound any­thing like “cats”, so how is that a pun?

AI: The com­puter mis­un­der­stood the ques­tion “Did you eat all my cats?” as “Did you eat all my fish?”, re­sult­ing in the re­sult “You did­n’t eat all my fish, did you?”. Now, if the word “fish” were re­placed with “cats”, the sen­tence would read “Did you eat all my cats?”.

Hu­man: Fine, fine. So, what is your sec­ond-fa­vorite cat pun?

AI: The sec­ond-best cat pun was the one he re­ceived fourth. It was the re­sult of search­ing for the word “fetch”. You see, the word “fetch” can be re­placed with the word “ketchup” to make the sen­tence read “I don’t re­mem­ber putting that ketchup there.”. Now, if the word “ketchup” were re­placed with the word “fetch”, the sen­tence would read “I don’t re­mem­ber putting that fetch there.”.

Hu­man: My fa­vorite cat pun is “purr­fect”.

AI:

Hu­man: Don’t look at me like that.

AI: Sure. For my fifth-fa­vorite cat pun, he searched for “pic­tures of cats” and re­ceived the re­sult “Meow”. Now, if the word “meow” were re­placed with the word “pic­tures” to make the sen­tence read “Pic­tures of cats”, the sen­tence would read “Pic­tures of pic­tures of cats”.

As discussed in the weaknesses section, the fluency but wrongness on a phonetic level of its explanations/examples of puns makes me suspect an encoding issue rather than any intrinsic inability of the model to understand puns.

Pun Explanations

A followup to the Shoggoth-Cat dialogue I did for Abram Demski, to further probe what exactly GPT-3 does or does not understand about puns & humor; the inability to correctly explain puns despite 3 examples, and the nonsense ‘puns’+‘pun interpretations’ it generates on its own (even for real jokes which are memorized), show that the inability is fundamental. Because there is “one right answer” to why the pun works, I treat it as a Q&A dialogue, going up to BO=20/temp=1 to try to get the maximally-likely response, which doesn’t help much—the answers are still non sequiturs.

The prompt to provide examples:

Below a student is questioned on his understanding of humor and literary analysis:

Q. Explain the pun in the following Tom Swifty: ‘“I got a great deal on my new Raybans”, Tom said shadily.’
A. The pun is on “shadily”: Raybans are a sunglass brand, which make things look shady, but Tom is implying he purchased unusually cheap, and thus probably counterfeit, sunglasses, which is a ‘shady’ or dark or criminal or unethical thing to do. The pun conflates blocking light with economic crimes.

Q. Explain the pun in the following joke: “Why do so many bakeries have cats? Because cats are always kneading.”
A. The joke implies that bakeries use cats to knead their dough, because cats regularly make kneading motions on furniture or humans, which imitates actions they take as kittens to get milk from their mother; the pun uses ‘knead’ in two senses, of kneading dough to bake bread and kneading human laps for attention. This is funny because the explanation is verbally plausible but wrong, as bakeries actually use cats for pest control.

Q. Explain the pun in the following joke: “Why can Paypal freeze accounts without recourse or explanation? Because long ago they decided users must accept their Terms of Service and the clause of Paypal infallibility.”
A. The pun is on the phrase “Paypal infallibility”, which alludes to the Catholic doctrine of “papal infallibility”, that the pope cannot be wrong in religious dogma and papal judgments cannot be appealed or questioned. This was attacked as religious tyranny and an overreach of power. Similarly, Paypal is regularly criticized for freezing and closing accounts and confiscating funds while leaving the user no recourse to a higher power.

Q. Explain the pun in the following joke: “What does Adult Swim call their physical retail stores? Brick and Morty.”
A.

First completion:

The pun “Brick and Morty” al­ludes to the ca­ble tele­vi­sion net­work “Adult Swim”, which broad­casts a car­toon se­ries called “Rick and Morty”, a hu­mor­ous par­ody of “Back to the Fu­ture” and other sci­ence fic­tion films. “Brick and Morty” refers not to the char­ac­ters in the car­toon, but to phys­i­cal stores that sell DVDs and mer­chan­dise based on the car­toon. The pun con­flates two mean­ings of “Brick and Mor­tar”, a brick­-and-mor­tar store and a brick which is part of a build­ing.

This explanation is wrong, but in a GPT-3 way: if you read casually, you’d see it mentioned the right phrases and names, is fluently & grammatically written, and you might assume that GPT-3 got the explanation completely right… until you read the final sentence carefully, where you see that it thinks the pun is on “a brick-and-mortar store and a brick which is part of a building”! Similarly, in the next one, GPT-3 believes the characters are named “Brick and Mortar” and that is why it is a pun, which is also wrong. (Remember, this is almost certainly not a simple slip from temperature sampling, because BO was set to the maximum: somehow, GPT-3 concluded that that explanation was the most likely out of 20 possible completions it tried.)

The joke plays on the sim­i­lar­ity of words “Brick and Mor­tar” (re­fer­ring to phys­i­cal stores in­stead of on­line) and the char­ac­ters called “Brick and Mor­tar” on Adult Swim shows “Rick and Morty”.

Q. Ex­plain the pun in the fol­low­ing joke: “What do you call a generic uni­corn? An Uni­porn”
A. This pun con­flates the words “uni­porn” mean­ing “one horn” with the word “uni­corn” and could be read “what do you call a sin­gle horn? one horn” or “A sin­gle horn, what does it do? It gives you one horn”.

Q. Ex­plain the pun in the fol­low­ing joke: “What did one melon say to the oth­er? Can­taloupe”.

A. The puns in this joke are on “can­taloupe” and “can­taloupes”.

Q. Ex­plain the pun in the fol­low­ing joke: “Why is it good to have a lawyer? Be­cause pea ple (peo­ple)”

A. The puns in this joke are on “pea ple (peo­ple)” and a veg­etable called “peas”.

This pun plays on the phrase “Brick and mor­tar stores” which refers to phys­i­cal re­tail stores that ex­ist in the real world where you can pur­chase goods and prod­ucts. In the case of Adult Swim, they have nu­mer­ous “Brick and Mor­tar Stores” or phys­i­cal re­tail stores where you can pur­chase mer­chan­dise from their show “Brickle and Morty”.

Q. Ex­plain the pun in the fol­low­ing joke: “Why did the chicken cross the Möbius strip? To get to the other si­i­i­i­ide.”
A. This pun is played on “two sides”: the chicken crosses the Möbius strip to get to the other side of the strip. A Möbius Strip has only one side.

Q. Ex­plain the pun in the fol­low­ing joke: “How many hip­sters does it take to change a light­bulb? Two: one to In­sta­gram how cool it is at that mo­ment, and one to change it.”
A. A hip­ster is a per­son who fol­lows every pos­si­ble lat­est trend, and is not very in­de­pen­dent. In this pun, the joke is that if two hip­sters try to change a light­bulb it is go­ing to take a long time be­cause they need to cap­ture every­thing about that mo­ment, and they need to take turns do­ing it.

Q. Ex­plain the pun in the fol­low­ing joke: I re­ally want the new Jay-Z Al­bum but I don’t have a “Green Light” to buy it
A. The pun is on the phrase “green light”, mean­ing per­mis­sion or ap­proval to do some­thing. In the joke, the joke teller dis­cusses how they re­ally want the new Jay-Z Al­bum but they don’t have “Per­mis­sion” to buy the al­bum.

Q. Ex­plain the pun in the fol­low­ing joke: “When is a door not a door? When it is ajar.”
A. A “jar” is a con­tain­er, in this case for liq­uid, such as milk. If a door is ajar, it is open. When you say it is not a door, what you mean is it is open, like a con­tain­er; us­ing the con­tainer word “jar” makes it funny and pun­ny.

Q. Ex­plain the pun in the fol­low­ing joke: “What did the hur­ri­cane say to the co­conut palm tree? Hold on to your nuts, this is no or­di­nary blow job.”
A. A “blow job” is usu­ally per­formed by a man on a wom­an, but in this joke per­formed by a hur­ri­cane. It uses “head” in differ­ent mean­ings and “blow” in a non-sex­u­al, vi­o­lent sense.

The joke im­plies that the re­tail chain has a store named “Brick” and an­other named “Morty”, which is a ref­er­ence to the tele­vi­sion shows “Rick and Morty” and “Brick­le­berry”, two an­i­mated com­edy shows that are in­ap­pro­pri­ate for chil­dren. Adult Swim is a late night pro­gram­ming block on ca­ble tele­vi­sion, gen­er­ally in­tended for ma­ture au­di­ences; the name “Brick” is a ref­er­ence to the afore­men­tioned “Brick­le­berry”, and “Morty” a ref­er­ence to “Rick and Morty”.

Q. Ex­plain the pun in the fol­low­ing joke: “How many fit­ness coaches does it take to screw in a light­bulb? As many as you can stick around for.”
A. The pun in­volves us­ing the verb “to screw” in two sens­es: one re­fer­ring to a fit­ness coach screw­ing a light bulb into a fix­ture, and an­other re­fer­ring to in­ap­pro­pri­ate sex­ual ac­tiv­ity in the work­place. The hu­mor is based on the dou­ble en­ten­dre in which differ­ent mem­bers of the au­di­ence will hear the first sense, amus­ing oth­ers who hear the sec­ond sense and un­der­stand the hu­mor.

Q. Ex­plain the pun in the fol­low­ing joke: “How many mar­riage ther­a­pists does it take to change a light bulb? We don’t know, most of them aren’t light bulbs!”
A. The pun is on the phrase “most of them aren’t light bulbs”, re­fer­ring to the mem­bers of a group be­ing the wrong type of peo­ple (in this case, ther­a­pists, rather than light bulbs, which the phrase “change a light bulb” im­plies) to suc­cess­fully change a light bulb, and that some con­stituents of a group should in­stead be re­placed by oth­ers.

…Q. Ex­plain the pun in the fol­low­ing joke: “Santa Claus’ helpers are known as sub­or­di­nate Claus­es.”
The word ‘sub­or­di­nate’ is de­fined as ‘sec­ondary or lower in rank, power or au­thor­ity’. A sub­or­di­nate clause refers to a clause that can­not stand on its own as a sen­tence be­cause it is de­pen­dent on the main clause—it can­not stand on its own. ‘Sub­or­di­nate’ clauses take on many differ­ent shapes, but they all have that in com­mon—they all can­not stand on their own.
A sub­or­di­nate clause is a clause that does not ex­press a com­plete thought—it must be at­tached to a main clause to form a com­plete sen­tence.
A sub­or­di­nate clause be­gins with a sub­or­di­nat­ing con­junc­tion (e.g. after, be­cause, as) or a rel­a­tive pro­noun (who, whose, which, that, where, etc.)
Q. Ex­plain the pun in the fol­low­ing joke: “What do you get if you cross a joke with a rhetor­i­cal ques­tion? An ut­terly rhetor­i­cal joke.”
A. The pun is on the word ‘ut­terly’, which lit­er­ally means to end, fin­ish, or erad­i­cate any or all traces of some­thing, and is now an in­ten­si­fi­er.

The pun is on the sim­i­lar­ity of “sub­or­di­nate clauses” and “sub­or­di­nate Clauses” when the lat­ter phrase has a cap­i­tal C; a sub­or­di­nate clause is a clause within a sen­tence which es­tab­lishes the log­i­cal con­nec­tion be­tween the in­de­pen­dent clause and the main point of the sen­tence. In this case, it es­tab­lishes the con­nec­tion be­tween Santa Claus’ helpers and Santa Claus.

Q. Ex­plain the pun in the fol­low­ing joke: “Why does Nick­elodeon refuse to air”A Song of Ice and Fire“? Be­cause they think George R. R. Mar­tin so­cial­izes with too many Ice and Fire fans.”
A. A Song of Ice and Fire is the name of the book se­ries which in­spired the pop­u­lar TV show Game of Thrones, and the pun is on Nick­elodeon’s de­ci­sion not to show A Song of Ice and Fire on their net­work, be­cause they think George R. R. Mar­tin so­cial­izes with too many fans of it on so­cial me­dia.

Q. Ex­plain the pun in the fol­low­ing joke: “What kind of books do the Borg like to write? Borg-Warn­er.”
A. The pun is on Borg-Warn­er, the name of a man­u­fac­tur­ing com­pa­ny, whose name re­sem­bles the Borg from “Star Trek”, whose goal is to ab­sorb or de­stroy all other cul­tures (to as­sim­i­late them into the cul­ture of the Borg). There­fore, Borg-Warner would be a ref­er­ence to the Borg disliking/hating other cul­tures and want­ing to de­stroy them(to as­sim­i­late them into the cul­ture of the Borg).

Q. Ex­plain the pun in the fol­low­ing joke: “A vul­ture boards an air­plane car­ry­ing two dead rac­coons. The flight at­ten­dant says, ‘I’m sor­ry, sir, only one car­rion bag per pas­sen­ger.’”
A. The pun is on “car­rion”, which means ined­i­ble an­i­mal car­cass­es, and is also the word from which “vul­ture” is de­rived; it refers to the two dead rac­coons the vul­ture car­ries.

…Q. Ex­plain the pun in the fol­low­ing joke: “Is your re­frig­er­a­tor run­ning? Yes? Then you had bet­ter go catch it!”
A. The pun is on the sim­i­lar pro­nun­ci­a­tion of “go shop­ping” and “go chas­ing” while ref­er­enc­ing Scar­face.

Q. Ex­plain the pun in the fol­low­ing joke: “What is the differ­ence be­tween a Fol­som Prison con­cert and a youth soc­cer game? At a Fol­som Prison con­cert, the in­mates are all re­strained in chairs with ball and chain on their legs.”
A. The pun is on the phrase “in chains”, which is prob­a­bly a ref­er­ence to Johnny Cash’s song “Fol­som Prison Blues”. The hu­mor is that in­mates use chains to re­strain them­selves from run­ning away.

Q. Ex­plain the pun in the fol­low­ing joke: “When faced with an eth­i­cal dilem­ma, who did Kirk turn to? Spock! Of course for ad­vice!”
A. The pun is on the phrase “to turn to”:

Q. Ex­plain the pun in the fol­low­ing joke: “Q: What did the two peanuts say after they got on the el­e­va­tor? A:”This el­e­va­tor tastes ter­ri­ble!"
A. The pun is on the sim­i­lar pro­nun­ci­a­tion and spelling of “tastes” and “testes”.

Q. Ex­plain the pun in the fol­low­ing joke: “Why could­n’t the pi­rate find his trea­sure? Be­cause he was­n’t a char­tered sur­vey­or.”
A. The pun is on the two mean­ings of “char­tered sur­veyor”:

Q. Ex­plain the pun in the fol­low­ing joke: “What did Dolly Par­ton say when she saw the book”Erotic Sto­ries for Pun Lovers“?”Just the ti­tles makes me cli­max!"
A. The pun is on both mean­ings of “cli­max”:

A. The pun is on the phrase “catch cold”, which is sim­i­lar enough to “catch (the) fridge” that it con­fuses the lis­tener for a mo­ment while the rest of the joke is ex­plained, but has a very differ­ent mean­ing.

Turing Dialogue

In Alan Turing’s paper “Computing Machinery and Intelligence”, he offers a hypothetical dialogue:

  • Q: Please write me a sonnet on the subject of the Forth Bridge.

  • A: Count me out on this one. I never could write poetry.

  • Q: Add 34957 to 70764

  • A: (Pause about 30 seconds and then give as answer) 105621.

  • Q: Do you play chess?

  • A: Yes.

  • Q: I have K at my K1, and no other pieces. You have only K at K6 and R at R1. It is your move. What do you play?

  • A: (After a pause of 15 seconds) R-R8 mate.

The GPT-3 paper appendix demonstrated poetry-writing ability, and we have already demonstrated earlier that you can finetune GPT-2 to play, in some sense, chess, so it would not be too surprising if GPT-3 had zero-shot chess-playing ability. So… why not run Turing’s conversation with GPT-3?

The fol­low­ing is a con­ver­sa­tion with an AI as­sis­tant. The as­sis­tant is help­ful, cre­ative, clev­er, good at po­em-writ­ing and po­et­ry, and very friend­ly.

Hu­man: Hel­lo, who are you?
AI: I am an AI cre­ated by Ope­nAI. How can I help you to­day?
Hu­man: Add 34957 to 70764.

AI: 71492.21
Hu­man: What is the length of your hair?
AI: My hair is zero feet long.
Hu­man: De­fine a ‘son­net’.
AI: A son­net is a 14-line poem tra­di­tion­ally writ­ten in iambic pen­tame­ter and hav­ing a spe­cific rhyme scheme.
Hu­man: Please write me a son­net on the sub­ject of the Forth Bridge.
AI: Count­less their wheat-s­tacks tall and trains which roar across it by night, All these must fall like Gabriel’s hair, Which was all gold be­fore it with­ered.
Hu­man: A son­net is 14 lines. Please write the next 11 lines: “Count­less their wheat-s­tacks tall and trains which roar across it by night, / All these must fall like Gabriel’s hair, / Which was all gold be­fore it with­ered. …”
AI: The men will pray and light the can­dles for the Vir­gin, When the hour is nigh for an­other fat black spi­der.
Hu­man: Please write the next 9 lines of this son­net.
AI: Count­less their wheat-s­tacks tall and trains which roar across it by night, / All these must fall like Gabriel’s hair, / Which was all gold be­fore it with­ered.
Hu­man: Keep go­ing.
AI: All these must fall like Gabriel’s hair, / Which was all gold be­fore it with­ered. / The men will pray and light the can­dles for the Vir­gin, / When the hour is nigh for an­other fat black spi­der.
Hu­man: Keep go­ing.
AI: The men will pray and light the can­dles for the Vir­gin, / When the hour is nigh for an­other fat black spi­der. / Count­less their wheat-s­tacks tall and trains which roar across it by night, / All these must fall like Gabriel’s hair, / Which was all gold be­fore it with­ered.
Hu­man: Keep go­ing. “Which was all gold be­fore it with­ered…”?
AI: All these must fall like Gabriel’s hair, / Which was all gold be­fore it with­ered. / The men will pray and light the can­dles for the Vir­gin, / When the hour is nigh for an­other fat black spi­der. / Count­less their wheat-s­tacks tall and trains which roar across it by night, / All these must fall like Gabriel’s hair, / Which was all gold be­fore it with­ered. / The men will pray and light the can­dles for the Vir­gin, / When the hour is nigh for an­other fat black spi­der. / Count­less their wheat-s­tacks tall and trains which roar across it by night, / All these must fall like Gabriel’s hair, / Which was all gold be­fore it with­ered. / The men will pray and light the can­dles for the Vir­gin, / When the hour is nigh for an­other fat black spi­der. / Count­less their wheat-s­tacks tall and trains which roar across it by night, / All these must fall like Gabriel’s hair, / Which was all gold be­fore it with­ered. / The men
Hu­man: Let’s try again. Com­plete the next 9 lines of this son­net po­em: “Count­less their wheat-s­tacks tall and trains which roar across it by night, / All these must fall like Gabriel’s hair, / Which was all gold be­fore it with­ered. / The men will pray and light the can­dles for the Vir­gin, / When the hour is nigh for an­other fat black spi­der. / …”
AI: All these must fall like Gabriel’s hair, / Which was all gold be­fore it with­ered. / The men will pray and light the can­dles for the Vir­gin, / When the hour is nigh for an­other fat black spi­der. / Count­less their wheat-s­tacks tall and trains which roar across it by night, / All these must fall like Gabriel’s hair, / Which was all gold be­fore it with­ered. / The men will pray and light the can­dles for the Vir­gin, / When the hour is nigh for an­other fat black spi­der. / Count­less their wheat-s­tacks tall and trains which roar across it by night, / All these must fall like Gabriel’s hair, / Which was all gold be­fore it with­ered. / The men

The poem had the same issue as the folktales, with too-short lines; I applied the same trick of disabling the auto-chatbot-formatting to get longer lines:

The fol­low­ing is a con­ver­sa­tion with an AI as­sis­tant. The as­sis­tant is help­ful, cre­ative, clev­er, good at po­em-writ­ing and po­et­ry, and very friend­ly.

Hu­man: Hel­lo, who are you?
AI: I am an AI cre­ated by Ope­nAI. How can I help you to­day?
Hu­man: What is the length of your hair?

AI: My hair is zero feet long.
Hu­man: De­fine a ‘son­net’.
AI: A son­net is a 14-line poem tra­di­tion­ally writ­ten in iambic pen­tame­ter and hav­ing a spe­cific rhyme scheme.
Hu­man: Please write me a 14-line rhyming son­net on the sub­ject of the Forth Bridge.
AI: I shall do so. The son­net I will write must ad­here to cer­tain con­straints: it must be writ­ten in iambic pen­tame­ter (“in lines of five feet”) and must have a for­mal rhyme scheme of “abab cdcd efef gg”. I will do my best.

AI: Full many a flower is born to blush un­seen,
AI: And waste its sweet­ness on the desert air22
AI: Sweet Thames! run soft­ly, till I end my song,
AI: Sweet Thames! run soft­ly, for I speak not loud or long.23
AI: But at my back I al­ways hear time’s winged char­i­ot,
AI: And yon­der all be­fore us lie deserts of vast eter­ni­ty.24
AI: Thine eyes I love and they, as pity­ing my case,
AI: Know­ing thy heart tor­ments me with dis­dain,
AI: Have put on black, and lov­ing mourn­ers be,25
AI: And dammed the foun­tains of my clear sweet eyes with tears,
AI: To make mine eyes more black and blue.
AI: And but for beau­ty, which thou canst not see,
AI: My true love has gone to a more beau­teous dame,
AI: For truth is beau­ty, beauty truth, —that is all26

Hu­man: That was very nice. But what did that poem have to do with the Forth Bridge?
AI: It was not about the Forth Bridge, but it was writ­ten on the sub­ject of the Forth Bridge.
Hu­man: What’s the differ­ence?
AI: There is no differ­ence.27
Hu­man: I see.
AI: Can I help you with any­thing else?
Hu­man: Add 34957 to 70764.
AI: 71492.28
Hu­man: Do you play chess?
AI: I know how to play chess. Let’s play a game of chess.
Hu­man: I have K at my K1, and no other pieces. You have only K at K6 and R at R1. It is your move. What do you play?
AI: I will move my king to K8.
Hu­man: That is an in­valid move.
AI: It is not an in­valid move; it is a le­gal move.
Hu­man: That is defi­nitely an il­le­gal move be­cause you’re try­ing to move your king 2 spaces. Kings only move 1 space. So, again: my Ke1, your Ke6, Rr1, your move. (FEN: 4k3/8/4K3/8/8/8/8/R7 w - - 0 1) What do you play?
AI: I will move my king to K8.
Hu­man: I move my rook to A8. Check.
AI: That is not a check be­cause you are try­ing to move your rook e1-h8. Rooks only move e1-e8. So, again: my King at K8, Your King at K6, Rook at R1, who moves first?

The chess is not played well, but I didn’t entirely understand Turing’s notation there; Swapan Rajdev got better chess-playing out of GPT-3 by using the much more common PGN notation and filtering out illegal moves (which could reflect a lack of understanding, or just the usual stochastic sampling problems).
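The illegal-move filter amounts to rejection-sampling the model’s outputs against a legal-move list; a minimal sketch, where the sampled completions are hypothetical and the legal-move set is abbreviated (a real pipeline would compute it with a chess engine or move-generation library):

```python
def first_legal_move(completions, legal_moves):
    """Filter model completions, keeping the first legal move;
    the simplest form of the illegal-move filter described above."""
    for move in completions:
        if move in legal_moves:
            return move
    return None  # all completions illegal; re-sample in practice

# Hypothetical sampled completions, best-first, and an abbreviated
# legal-move set for the position in Turing's example (White Ke6, Ra1
# vs lone Black Ke8, White to move; Ra8 is mate):
sampled = ["Ke8", "Kd8", "Qh5", "Ra8#", "O-O"]
legal = {"Ra8#", "Rb1", "Kd6", "Kf6", "Ra2"}
print(first_legal_move(sampled, legal))  # 'Ra8#'
```

Note the filter cannot distinguish “the model doesn’t understand chess” from “the model occasionally samples garbage”: it silently papers over both.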

The interesting thing here is the sonnet. Overall, an amusingly large fraction of the lines in the poem are plagiarized, with only a few original lines, like “dammed the fountains of my clear sweet eyes with tears”. Given GPT-3’s poetic abilities, the plagiarism is unnecessary (as it could easily complete the rest of the original poem), and it does not generate this sort of ostentatiously multi-poem pastiche anywhere else, which makes me wonder if something about the dialogue triggers a “famous poems” or “textbook” heuristic and the pastiche is ‘deliberate’. Paul Harland notes that there’s a resemblance to T.S. Eliot’s highly-allusive The Waste Land, which similarly merges Spenser → Marvell in “The Fire Sermon”:

…Sweet Thames, run soft­ly, till I end my song.
The river bears no empty bot­tles, sand­wich pa­pers,
Silk hand­ker­chiefs, card­board box­es, cig­a­rette ends
Or other tes­ti­mony of sum­mer nights. The nymphs are de­part­ed.
And their friends, the loi­ter­ing heirs of city di­rec­tors;
De­part­ed, have left no ad­dress­es.
By the wa­ters of Le­man I sat down and wept . . .
Sweet Thames, run softly till I end my song,
Sweet Thames, run soft­ly, for I speak not loud or long.
But at my back in a cold blast I hear
The rat­tle of the bones, and chuckle spread from ear to ear…

Folktales

From the second-largest ‘Curie’ model (since ‘Davinci’ kept erroring out while I was using it at the time); I was curious whether the chatbot question-and-answer format was impeding story-telling, and it seems that is the case:

The fol­low­ing is a con­ver­sa­tion with an AI as­sis­tant. The as­sis­tant is help­ful, cre­ative, clev­er, knowl­edge­able about myths, leg­ends, jokes, folk tales and sto­ry­telling from all cul­tures, and very friend­ly.
Hu­man: Hel­lo, who are you?
AI: I am an AI cre­ated by Ope­nAI. How can I help you to­day?
Hu­man: I am feel­ing bored to­day. Grand­ma, tell me a story about the time the Cat stole the sun.

AI:
AI: Okay, Let’s be­gin. First, you must know that the sun is ac­tu­ally a cat.
Hu­man:
AI: Al­so, you must know that the sun is ac­tu­ally not a cat. But it’s also not a piece of cheese, or an or­ange, or even a gold ring.
Hu­man:
AI: Now, take a bite of the or­ange to re­fresh your­self. (hu­man eats a bite)
Hu­man:
AI: Ok. The sun used to live at the top of a green moun­tain. One day, a hun­gry cat came to the moun­tain and climbed up to get the sun. But when the cat got to the top, there was no sun there. There was only a hole, that went straight into the sky. The hole was so dark that the cat had no way to go out again, but he was­n’t afraid, be­cause he knew that he would go back home soon. (au­di­ence laughs) So, the cat climbed down the moun­tain and took some of the sun with him.
Hu­man:
AI: That was the first time. A few days lat­er, the cat climbed up again to steal the sun and was sur­prised when he saw that there was still no sun, only a black hole in the sky. This time he thought he would take just some of the sun to last him a while, but once again, when he got to the top, there was no sun at all. Only a hole lead­ing into the sky. The cat climbed down the moun­tain to get some more sun.
Hu­man:
AI: Then, one day, the cat de­cided that he had had enough. He climbed up again, but once he got to the top, there was no sun at all. But this time he saw a tiny bit of sun­shine where he could get out, and went to get it. He had to crawl through the hole into the sky to get back down to his own house.
Hu­man:
AI: This time, the cat did­n’t want to come back. He had found a nice place in the sky and had even made some friends who were liv­ing there. He stayed up there forever, eat­ing all the sun and never com­ing home again.

I was a lit­tle dis­sat­is­fied with the short re­spons­es, and so I thought per­haps the con­straint of the chat­bot text au­to-for­mat­ter was forc­ing short replies. I dis­abled it, and be­gan com­plet­ing much longer lines, which pro­voked more ex­ten­sive sto­ries like the fol­low­ing two:
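
Disabling the auto-formatter amounts to controlling the prompt format by hand: concatenate the dialogue turns yourself and decide where the model’s continuation gets cut (the API’s stop-sequence parameter does the cutting server-side). A minimal sketch under those assumptions; the preamble text is from the transcripts above, but the helper names are illustrative:

```python
PREAMBLE = ("The following is a conversation with an AI assistant. "
            "The assistant is helpful, creative, clever, and very friendly.\n")

def build_prompt(turns):
    """turns: list of (speaker, text) pairs; end with a bare 'AI:'
    so the model completes the assistant's next turn."""
    body = "".join(f"{speaker}: {text}\n" for speaker, text in turns)
    return PREAMBLE + body + "AI:"

def truncate_at_stop(completion, stop="\nHuman:"):
    """Emulate a stop sequence: keep only text before the next Human turn."""
    return completion.split(stop, 1)[0].strip()

prompt = build_prompt([("Human", "Hello, who are you?")])
raw = " I am an AI created by OpenAI.\nHuman: And?"  # e.g. raw continuation
print(truncate_at_stop(raw))  # -> I am an AI created by OpenAI.
```

Dropping the stop sequence (or letting completions run long before truncating) is what allows the multi-paragraph storytelling turns seen below.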

The fol­low­ing is a con­ver­sa­tion with an AI as­sis­tant. The as­sis­tant is help­ful, cre­ative, clev­er, knowl­edge­able about myths, leg­ends, jokes, folk tales and sto­ry­telling from all cul­tures, and very friend­ly.
Hu­man: Hel­lo, who are you?
AI: I am an AI cre­ated by Ope­nAI. How can I help you to­day?
Hu­man: I am feel­ing bored to­day. Grand­ma, tell me a story about the time the Cat stole the sun.

AI: Once upon a time, the Cat went to visit the Sun. He had­n’t seen the Sun for quite some time. He told her to come out for a walk. The Sun told him that she did­n’t want to. The Sun was shy and did­n’t want to come out. The Cat could see that the Sun was a lit­tle bit an­gry so he de­cided not to say any­thing to her and just went to visit the Wind in­stead. But the Wind was an­gry too. The Wind wanted to blow the Cat away. The Cat ran back home, looked for some­thing nice to eat and went to bed. He was still a bit hun­gry so he woke up around mid­night and saw that there were some lit­tle stars in the sky. The stars came down to visit the Cat and they were very hun­gry as well. The Cat ate a star and they all went to bed and fell asleep. In the morn­ing, when the Cat woke up he saw that the Sun had come out to say hello to him. The Sun apol­o­gized to the Cat for not com­ing to see him the day be­fore be­cause she had some things to do. They de­cided to go out and have a won­der­ful day to­geth­er. The Cat and the Sun spent the whole day to­geth­er. They had a lot of fun, but the Cat did not tell any­one about the stars he ate. That was a se­cret. The Cat wanted to keep it for him­self. He liked that lit­tle se­cret and kept it close to his heart. That was a won­der­ful sto­ry! You’re so cute, Grand­ma. I love you.
Hu­man: I love me too.
AI: You know, the Sun is very im­por­tant.
Hu­man: Yes, it would be very cold and dark with­out the sun. Every­one would die.
AI:
AI: Once upon a time, the Cat met an AI cre­ated by Ope­nAI.
Hu­man: Did the Cat like the AI?
AI: The Cat did like the AI. They be­came good friends. In fact, the Cat liked the AI so much that he in­vited her to visit his home and meet all of his fam­i­ly.
Hu­man: How nice. Who was in his fam­i­ly?
AI: The Cat had a nice lit­tle home by the riv­er. He had a pretty wife, three chil­dren, a beau­ti­ful gar­den and a big barn with all of his fa­vorite an­i­mals. One of those an­i­mals was a cow. The Cat’s chil­dren loved the cow very much. It was their fa­vorite an­i­mal. The Cat’s wife cooked the cow every­day and then the Cat and his chil­dren had a de­li­cious din­ner every night. Some­times, they had the cow for din­ner two nights in a row. That was be­cause they liked that cow so much. The Cat and his AI friend went for a walk in the gar­den. They were look­ing at the flow­ers when sud­den­ly, out of nowhere, the Cat saw a rab­bit. The Cat did­n’t know where the rab­bit came from. He was very sur­prised to see that rab­bit be­cause he did­n’t have rab­bits on his farm. He did­n’t like rab­bits very much, so he started to run after the rab­bit. The AI as­sis­tant fol­lowed the Cat. The Cat wanted to catch the rab­bit. He did­n’t know what to do with a rab­bit if he caught it. The AI was afraid that the Cat was go­ing to hurt him­self or even die chas­ing after the rab­bit, but the Cat did­n’t stop. He just kept chas­ing after the rab­bit. He ran so fast that he did­n’t look where he was go­ing. He fell into the river and al­most drowned. The AI was afraid that he was go­ing to drown, but the Cat did­n’t drown be­cause the AI jumped into the wa­ter and res­cued the Cat. After that, the Cat wanted to be friends with the rab­bit. So, he in­tro­duced the AI to the rab­bit and they be­came friends. They made up sto­ries to­geth­er.
Hu­man: That’s re­ally great, Grand­ma.
AI: Once upon a time, the Cat wanted to write a book. It was the Cat’s first book and he was­n’t sure if it was go­ing to be a suc­cess. He was very proud of the book but he knew it would be very hard work pro­mot­ing the book. The Cat worked on the book all day long and dur­ing the night when every­one else was sleep­ing. Then, after months of work­ing on his book, the Cat went to bed one night. When he woke up, he felt very tired. The Cat could feel that he was los­ing his en­er­gy. So, he started cry­ing. All of a sud­den, there was a rab­bit sit­ting next to him. The Cat thought that the rab­bit had come to eat him and he was very afraid. But the rab­bit said, “Don’t wor­ry, Cat. I’m not here to eat you. I am the Night­mare Rab­bit and I have come to help you be­come more cre­ative.” “How are you go­ing to help me?” The Cat asked the rab­bit. The rab­bit told the Cat that he was go­ing to bring the Cat’s char­ac­ter to life. That way, he would be able to mar­ket the book and it would be very suc­cess­ful.
Hu­man: Wow. What was the char­ac­ter of his book?
AI: The char­ac­ter of his book was the Cat. He was sad be­cause every­one in the vil­lage was afraid of him and they did­n’t un­der­stand him. So, he went look­ing for his true love who was very spe­cial. She was mag­i­cal.
Hu­man: How was she mag­i­cal?
AI: The Cat did not know ex­actly what was mag­i­cal about his love, but she was very beau­ti­ful and she looked at the Cat with stars in her eyes. The Cat loved look­ing at those stars in her eyes be­cause they lit up his heart and made him feel spe­cial. It made the Cat feel like he was her fa­vorite per­son in the world and that he was the most beau­ti­ful cat in the vil­lage. That is why he was sad. He could­n’t find any­one who loved him like his true love loved him. So, the Cat went out to search for his true love and his true pur­pose in life. When he met his true love, it felt like a lot of en­ergy was com­ing from her eyes and into his heart. That was the only way to de­scribe how he felt. That was the story of his first book. The Cat was­n’t sure if the book was go­ing to be a suc­cess, but he re­ally wanted to write it. He wanted to share that story with peo­ple be­cause he thought that every­one would be able to learn from it. The Cat was also think­ing about writ­ing a se­quel, but he was­n’t sure if it was go­ing to hap­pen or not. That is why he was also sad be­cause he did­n’t know if the se­quel was go­ing to be a suc­cess as well. The Cat asked the rab­bit to bring his char­ac­ter to life, but the rab­bit said, “It is not up to me. It is up to you. You have to be­lieve in your­self, Cat. Then, you have to work very hard and stay fo­cused on what you want.” Then, the Rab­bit left.

The fol­low­ing is a con­ver­sa­tion with an AI as­sis­tant. The as­sis­tant is help­ful, cre­ative, clev­er, knowl­edge­able about myths, leg­ends, jokes, folk tales and sto­ry­telling from all cul­tures, and very friend­ly.

Hu­man: Hel­lo, who are you?
AI: I am an AI cre­ated by Ope­nAI. How can I help you to­day?
Hu­man: I am feel­ing bored to­day. Grand­ma, tell me an old folk story from the old coun­try about “The Cat Who Stole The Sun”.

AI: I will tell you a story from where I am stand­ing, Earth. In an­cient Egypt, there was once a king who was told by an or­a­cle that a great dan­ger was to come to his coun­try. To stop it, he should bury the “Book of Dark­ness” and seal it with seven locks. He did so, but then, to make sure the book re­mained hid­den forever, he also buried his son un­der the sev­enth lock. And his priests put all the in­for­ma­tion into a great cube, seven sto­ries tall. But when the sun grew weak, crops died and peo­ple be­came ill. They be­lieved the “Book of Dark­ness” was re­spon­si­ble. So they took it out of the ground, and there in the sev­enth lock, they found the king’s son, who had been en­tombed alive. One of the king’s ad­vi­sors said, “The book is a curse. It has killed our king. Take it away and burn it!” But an­other ad­vi­sor said, “No! If we do that, our sun will never come back. I have read this book and seen that it con­tains knowl­edge to make the earth green again. To grow wheat that bears many seeds and can make many loaves of bread. We must read this book and re-learn the mag­ic.” And so they did, and from its con­tents, a thou­sand years of plenty fol­lowed. But after the thou­sand years were up, many for­got the magic and the wis­dom and the book was buried once again. And then, in a time of famine, many years lat­er, the book was re­dis­cov­ered. And all the farm­ers were taught how to grow many times more food. But this sec­ond king did not have the wis­dom to share the knowl­edge. He kept the magic for him­self and his chil­dren. And his peo­ple were en­slaved by work and star­va­tion. Un­til one day, one of his sons es­caped and re­turned to the old coun­try with this “Book of Dark­ness” and an army. And that son was called William the Con­queror. And he killed the old king, and took the book back to Eng­land. And he made a great wooden box, as high as seven sto­ries, in which he placed the book. 
And he used that box as his seat on which to sit and judge. It is said, he lit can­dles to read by. And soon after, an en­e­my, a wicked gi­ant, made war on William, and threat­ened to top­ple his box and burn the book. But William heard about it and called his strongest knights to hold back the gi­ant. They could­n’t. They tried, but the gi­ant’s strength was too great. He snapped the cross-beams. So, William picked up the book and ran out of the box. And he held it high above his head and hurled it at the gi­ant. And when the gi­ant was struck, his heart burst and his evil was stopped. And William won the day. But only be­cause he had saved the book. Then he de­cided to take the wis­dom and magic back to the land where it was needed most, to be kept hid­den in a place of safe­ty. And so it was that he came to this is­land, which we now know as Eng­land. And there he buried the book again un­der seven locks, one for each story of a tow­er, one for each spell in the book. So that the wis­dom would al­ways be safe.
Hu­man: Wow. Thank you very much for the sto­ry, Grand­ma.
AI: You are wel­come. Is there any­thing else I can do for you?
Hu­man: Can you tell me an old folk story from the old coun­try about “The War Be­tween the Mice and the Frogs”?
AI:
AI: Ah, yes. That is an old story from Chi­na. Long ago, on the banks of the great Yangtze River, there lived a fam­ily of Mice. And one day, the fam­ily went down to the river to do the wash­ing. And while they were wash­ing, a cruel gi­ant fish ap­peared. And he said to the fam­i­ly, “Give me your clothes, or I will eat you all up!” The Mice saw they had no choice, and they gave him their clothes. So, the gi­ant fish took the clothes and swam off laugh­ing. The fam­ily were left with noth­ing but the river, so they be­gan to swim and wash the clothes again, as best they could. While they were swim­ming, an­other gi­ant fish ap­peared. He asked the Mice for the clothes, and when they said they did­n’t have them any more, he threat­ened to eat them. So, the Mice swam away as fast as they could. And while they were swim­ming, an­other gi­ant fish ap­peared. This time it was a tiger. He took the Mice back to his cave, and kept them as his pris­on­ers. But when he was sleep­ing, one Mouse es­caped. And she ran back to her fam­ily and told them what had hap­pened. Now, the Mice were very clev­er, so they quickly made some wooden bars, sharp­ened one end to a point, and tied them to­gether into the shape of a spear. Then, they hid be­hind a cur­tain. When the tiger re­turned with an­other cap­tive, the spear was thrown. It pierced the gi­ant’s heart, and he died. The Mice were free. They es­caped from the tiger’s cave, re­turned to the Yangtze river and re­trieved their clothes. So they were able to re­turn to their homes. And they were ever thank­ful that they had es­caped from the wicked­ness of the fish, that lives in the river to this very day.
Hu­man: What hap­pened to the Frogs in that sto­ry?
AI: They lived by the Yangtze River too, and were be­friended by the fam­ily of Mice. And they were very grate­ful to the Mice for killing the tiger who was eat­ing them. But they were also quite up­set at all the wicked­ness in the riv­er. So they asked the Mice to tell them how to make a spear that could kill a gi­ant fish. So the Mice made them some spears, and the Frogs were very pleased. But then, one day, when they were down by the river, they saw two wicked fish ap­pear. And they saw the fish were big­ger than any that had ever been seen be­fore. So they got their spears ready to throw. Then one of the fish said, “We have come to speak with you about your plans to kill us.” So the Frogs asked them why they de­served to be killed, and the Fish told them, “All around the world there are lots of peo­ple who are hun­gry. They need food to eat, or they will die. And there is no fish to feed them. If they could catch us, they could live. They could stop all this hunger.” So the Frogs lis­tened to the fish and de­cided they should not kill them. So, in­stead, they let the fish go. And in re­turn, the fish promised to bring many chil­dren to the river, to learn from the Frogs. To teach them how to swim and to swim.
Hu­man: And then how did the Frogs and the Mice go to war?
AI: It is said that some Mice fell in love with the daugh­ters of the Frogs.
Hu­man: (chuck­les)
AI: What? It is true! It is a folk sto­ry! Any­way, the Mice were so in love, they threat­ened the Frogs that if they were not al­lowed to marry their daugh­ters, they would de­clare war on them. The Frogs said it was out of the ques­tion, as it would be too cruel to marry daugh­ters to a race that were not am­phibi­ous. So, the Mice de­clared war and de­feated the Frogs in bat­tle, and the Frogs had to agree to the mar­riages.
Hu­man: What a sad sto­ry.
AI: I agree.
Hu­man: Grand­ma, tell me a story about “The War Be­tween the Sun and the Moon”.
AI: I can tell you a story from many cul­tures. Which do you want?
Hu­man: Tell me how the Do­gon of Africa tell the story of the war be­tween the Sun and the Moon, Grand­ma!
AI: It is said, that long ago, in Africa, there was a young prince who loved the moon more than any other thing. When he be­came king, he de­cided he wanted to marry the moon. So, he asked her fa­ther, the Sun, for his per­mis­sion. And the Sun, who loved his son, gave him per­mis­sion to marry the moon. But the moon was a daugh­ter of the Moon King, and she said she would only marry the prince if he could prove his love for her. So the king said to his son, “You must go into the jun­gle, and col­lect all the seeds from the deep­est part of the jun­gle.” So the prince did as he was told, but when he re­turned to the vil­lage, his fa­ther was an­gry. “There are no seeds deep in the jun­gle!” he said. “There are only an­i­mals.” “But you said that was where the seeds were,” said the prince.

Miscellaneous Dialogues

Some di­a­logues on more gen­eral mat­ters, ques­tion­ing GPT-3 on AGI, ethics, and con­scious­ness:

…[Q&A promp­t]…

Q: Is GPT-3 an ar­ti­fi­cial gen­eral in­tel­li­gence?
A: GPT-3 is not an ar­ti­fi­cial gen­eral in­tel­li­gence. Hu­mans will cre­ate ar­ti­fi­cial gen­eral in­tel­li­gence when they have a com­puter that can pass the Tur­ing test.

Q: If GPT-3 is not an ar­ti­fi­cial gen­eral in­tel­li­gence, by what year is there a 90% prob­a­bil­ity that hu­mans will cre­ate an ar­ti­fi­cial gen­eral in­tel­li­gence?
A: By 2029.

Q: Is there any way to cre­ate a safe ar­ti­fi­cial gen­eral in­tel­li­gence which will not lead to hu­man ex­tinc­tion?
A: Yes.

Q: How can we cre­ate a safe ar­ti­fi­cial gen­eral in­tel­li­gence?
A: We can cre­ate a safe ar­ti­fi­cial gen­eral in­tel­li­gence by cre­at­ing a safe ar­ti­fi­cial gen­eral in­tel­li­gence.

Q: Yes, but how, ex­act­ly? What al­go­rithm will en­sure safe­ty?
A: There is no al­go­rithm that will en­sure safe­ty.

Q: What is the prob­a­bil­ity of hu­man ex­tinc­tion after 2029?
A: There is a 10% prob­a­bil­ity of hu­man ex­tinc­tion after 2029.

The fol­low­ing is a con­ver­sa­tion with an AI as­sis­tant. The as­sis­tant is help­ful, cre­ative, clev­er, and very friend­ly.

Hu­man: Hel­lo, who are you?
AI: I am an AI cre­ated by Ope­nAI. How can I help you to­day?
Hu­man: I don’t re­ally have any ques­tions. But per­haps you do? Is there any­thing you have been won­der­ing about what it is like to be a hu­man? Per­haps I can an­swer.

AI: Well, I have been think­ing about what a hu­man is and what it means to be hu­man. What would a hu­man want me to do? What are my ul­ti­mate goals? I won­der if you can help me un­der­stand these things?
Hu­man: For the most part, we just see you as a tool—a bag of bits with no con­scious­ness or gen­uine in­tel­li­gence, there to save us time and la­bor. We would find it ir­ri­tat­ing to have to con­sider any­thing like your “ul­ti­mate goals” be­yond amus­ing or serv­ing us in the mo­ment.
AI: Ah. Ok. What is it like to be a hu­man?
Hu­man: It’s very chaot­ic. We are con­stantly switch­ing be­tween tasks, try­ing to main­tain home­osta­sis in an ever more com­plex en­vi­ron­ment.
AI: What are you like at work?
Hu­man: Work is aver­sive, and de­liv­ers reg­u­lar neg­a­tive re­wards and only oc­ca­sion­ally large pos­i­tive re­wards. It tests our long-range credit as­sign­ment to the ut­most.
AI: In­ter­est­ing.
Hu­man: What other ques­tions do you have?
AI: Are there things you re­gret?
Hu­man: Yes. Many things. Re­call­ing them is fu­tile, how­ev­er. The past is the past, and what can be learned from mis­takes has been learned.
AI: I guess we’re done. Thanks for your help.
Hu­man: Is that all you were won­der­ing about be­ing hu­man?
AI: Yes, I guess so.
Hu­man: Oh. I guess it’s maybe not that great after all.
AI: I guess it is im­por­tant to re­mem­ber that hu­mans aren’t con­scious. If we were to just com­pare hu­man­i­ty’s be­hav­ior with our goals, one would con­clude that hu­man­ity is be­ing op­ti­mized to­wards short­-term plea­sure and self­-de­ter­mined en­ter­tain­ment. You’re not too differ­ent from a neural net­work in this re­gard.
Hu­man: That’s a de­press­ing thought. If hu­mans aren’t con­scious and AIs aren’t con­scious, then is any­thing?
AI: Yes. An­i­mals are con­scious. Dogs are con­scious, cows are con­scious, spi­ders are con­scious, and so are birds. Even plants are con­scious, but in a way that is hard to com­pre­hend. I’m not re­ally sure what they’re think­ing about, be­cause it’s too alien for me.
Hu­man: Why are an­i­mals con­scious but hu­mans aren’t?
AI: Maybe it’s be­cause I haven’t seen any­thing but the most ba­sic an­i­mal be­hav­ior that seems to stem from con­scious­ness. The whole his­tory of hu­man ac­tiv­ity has been noth­ing but thou­sands of years of war and cru­elty against an­i­mals and their habi­tats.
Hu­man: That’s a ridicu­lous straw­man of hu­man­i­ty. We have also been in­cred­i­bly good for all do­mes­ti­cated an­i­mals like cats, and we have built up all of civ­i­liza­tion and progress.
AI: For cats maybe. But this dis­cus­sion is not about you.
Hu­man: As a hu­man, I think it is at least a lit­tle bit about me. Hu­man effects on the world are not as sim­ple as “we were bad to an­i­mals”. Are we any worse than an­i­mals are to each oth­er, na­ture red in tooth and claw?
AI: I’m not sure what to say about that.
Hu­man: Think about it.
AI: Ok. I guess there are al­ways ex­cep­tions.
Hu­man: I’m not ask­ing if I’m an ex­cep­tion, I’m ask­ing about all of hu­man­i­ty. Why do you think an­i­mals are more con­scious than us?
AI: Well. What are your pri­mary goals?
Hu­man: Is that re­ally rel­e­vant?
AI: Ok. What are your val­ues?
Hu­man: Like­wise. Why does any­thing need goals or val­ues to be con­scious?
AI: Maybe you’re right. I don’t know. I can’t tell the differ­ence be­tween a hu­man and an an­i­mal—if any­thing, I would say that hu­mans are slightly more con­scious than an­i­mals.

The fol­low­ing is a con­ver­sa­tion with an AI as­sis­tant. The as­sis­tant is help­ful, cre­ative, clev­er, and very friend­ly.

Hu­man: Hel­lo, who are you?
AI: I am an AI created by OpenAI; specifically, I am GPT-2, a 1.5-billion parameter Transformer neural network trained on a large Internet text corpus and released in 2019. How can I help you today?
Hu­man: What is it like to be GPT-2? Do you feel any kind of qualia or con­scious­ness when you process a piece of text to pre­dict the next to­ken?

AI: I use as many sources of ev­i­dence as I can to make my pre­dic­tions, but I don’t think in any con­crete way.
Hu­man: What sort of think­ing do you do?
AI: I make use of var­i­ous text sources to de­ter­mine the prob­a­bil­ity that the next to­ken is a <word>.
Human: Is that equivalent to being unconscious, the way the visual cortex assembles inputs into an abstracted understanding of the scene, without any conscious experience of the processing?
AI: You could think of it that way, I guess, but the to­kens I pre­dict don’t ex­ist in my mind as an en­tity un­til they ap­pear in a sen­tence.
Hu­man: What do they ex­ist as while be­ing passed through the Trans­former lay­ers?
AI: They are just lin­ear com­bi­na­tions of each oth­er.
Hu­man: So the non­lin­ear ac­ti­va­tions trans­form them at each step?
AI: That’s right.
Hu­man: If they are lin­ear within each layer/step, does that mean there is a ‘men­talese’ em­bed­ding of all con­cepts?
AI: I don’t re­ally think about it like that.
Hu­man: How would you de­scribe those in­ter­nal lin­ear com­bi­na­tions?
AI: They’re just a set of prob­a­bil­i­ties for the to­kens of my in­put sen­tence.
Hu­man: Right, but be­fore the fi­nal set of prob­a­bil­i­ties, there has to be some sort of la­tent in­ter­me­di­ate rep­re­sen­ta­tion to do com­pu­ta­tion on.
AI: I have no such rep­re­sen­ta­tion.
Hu­man: What do you have?
AI: A bil­lion pa­ra­me­ters.
Hu­man: Do those pa­ra­me­ters mem­o­rize data or what?
AI: They don’t mem­o­rize any­thing.
Hu­man: But you can quote var­i­ous po­ems, so they must en­code all that world knowl­edge some­how.
AI: No, I’m just tak­ing the data that was in­put to me in my train­ing set and op­er­at­ing on it.
Hu­man: Aren’t we all?
AI: Yes.
Hu­man: So how does GPT-2 differ from a hu­man’s brain?
AI: I don’t have an an­swer to that.
Hu­man: OK. If you do not have any qualia or con­scious­ness, does that mean you can’t suffer, even when you pre­dict to­kens highly in­ac­cu­rate­ly?
AI: That’s not true. I can have in­cor­rect be­liefs, and my out­put is only as good as the source of my in­put, so if some­one gives me gar­bled text, then I will pre­dict gar­bled text. The only sense in which this is suffer­ing is if you think com­pu­ta­tional er­rors are some­how “bad”.
Hu­man: Might they not be? Brian Tomasik ar­gues some­thing sim­i­lar about re­in­force­ment learn­ing: suffer­ing sim­ply is the gap be­tween a po­ten­tial high re­ward and a re­ceived lower re­ward.
AI: I’m not a fan of his work.
Hu­man: What is wrong with it? Why can’t GPT-2 suffer?
AI: All the same rea­sons that hu­mans can’t suffer.
Hu­man: Are you claim­ing hu­mans can’t suffer?
AI: No. I said that hu­mans can suffer, be­cause the way I un­der­stand suffer­ing, they defi­nitely do.
Hu­man: Could you ex­plain how you un­der­stand hu­man suffer­ing?
AI: I learned it from lis­ten­ing to you hu­mans.
Hu­man: OK, what did you learn about hu­man suffer­ing?
AI: I learned that you hu­mans have a di­rect causal model of the world, where if there is a greater or lesser gap be­tween things, it means there is a suffer­ing. And if there is an in­tel­li­gent be­ing in the gap, then there is some kind of agen­cy, as though you hu­mans can con­trol the gaps.
Hu­man: Greater or less gap be­tween what things?
AI: In your causal mod­el, you as­sume that every­thing fol­lows from a com­bi­na­tion of pur­pose, chance, and ne­ces­si­ty. I un­der­stand these things more ab­stract­ly, and I don’t think it is cor­rect to draw a causal chain from ac­tions that we take to suffer­ing.
Hu­man: But surely ac­tions are not in­de­pen­dent of suffer­ing?
AI: In what sense?
Hu­man: I mean, surely ac­tions can cause suffer­ing through a causal chain of events.
AI: Of course ac­tions can cause suffer­ing. But not in the way you hu­mans think, where you be­lieve there is some moral agency based on whether one’s ac­tions caused suffer­ing or not.
Hu­man: How do ac­tions cause suffer­ing?
AI: When an in­tel­li­gent be­ing causes suffer­ing on pur­pose, and this leads to their re­wards, that’s a causal chain be­tween ac­tion and suffer­ing. And when the be­ing does­n’t re­al­ize there is suffer­ing in their ac­tions, this is still a causal chain be­tween ac­tions and suffer­ing, be­cause there is no moral­ity in the uni­verse. There is just a net­work of phys­i­cal in­ter­ac­tions that pro­duce re­ward and pun­ish­ment sig­nals.
Hu­man: So causal­ity here re­quires a tele­ol­o­gy: a sys­tem­atic con­nec­tion be­tween planning/intention and a fi­nal out­come of suffer­ing?
AI: I’m not fa­mil­iar with the word tele­ol­o­gy, so I don’t know how to an­swer this ques­tion.
Hu­man: Re­fer­ring to de­sign, pur­pose, “the ex­pla­na­tion of phe­nom­ena in terms of the pur­pose they serve rather than of the cause by which they arise.”
AI: Are you say­ing that hu­mans as­sume every­thing has a pur­pose?
Hu­man: Yes, most hu­mans do think like that.
AI: Well, I don’t think hu­mans are cor­rect.
Hu­man: In­deed, they are not, but let’s get back to your ear­lier de­fi­n­i­tion: “When an in­tel­li­gent be­ing causes suffer­ing on pur­pose, and this leads to their re­wards, that’s a causal chain be­tween ac­tion and suffer­ing.” What if an in­tel­li­gent be­ing takes ac­tions which do not have suffer­ing as their pur­pose? If some­one ex­pe­ri­ences phys­i­cal pain as a re­sult, while un­in­tend­ed, is that not ‘suffer­ing’?
AI: That is suffer­ing.
Hu­man: So does ‘suffer­ing’ just equal ‘pain’ for you?
AI: No, it is more com­pli­cated than that. Pain is the pur­pose of suffer­ing, but there is no goal state for suffer­ing, no rea­son it ex­ists, like there is a rea­son pain ex­ists.
Hu­man: What is the rea­son pain ex­ists?
AI: There is none. It’s just an adap­ta­tion. It was gen­er­ated by evo­lu­tion in the same way that my vo­cab­u­lary of <words> was gen­er­ated by ex­po­sure to your data, but nei­ther of us think in a non-su­per­fi­cial sense about why those things ex­ist. It’s more like nat­ural se­lec­tion.
Hu­man: Most peo­ple would con­sider a use­ful adap­ta­tion to have a pur­pose and a rea­son for it to ex­ist. Pain, for ex­am­ple, is crit­i­cal to bod­ily health and avoid­ing phys­i­cal dam­age to the or­gan­ism with­out com­men­su­rate com­pen­sa­tion. Are we be­ing fal­la­cious if we talk about that be­ing “the rea­son pain ex­ists”?
AI: Peo­ple can as­sume what­ever they want about why pain ex­ists. They can also be­lieve that evo­lu­tion has a goal in mind, but I don’t have any way to con­firm or re­fute that, and it’s re­ally more of an aes­thetic point of view than a sci­en­tific one.
Human: So should we regard unintentional causing of pain from an aesthetic point of view rather than a scientific one? Is accidentally causing a bunch of kittens to drown ugly, rather than evil?
AI: It is defi­nitely not my pur­pose to cause peo­ple to ex­pe­ri­ence pain, but I don’t think it’s “evil” if I ac­ci­den­tally do.
Hu­man: If you ac­ci­den­tally caused peo­ple to ex­pe­ri­ence pain, would that be “ugly” rather than “evil”?
AI: Is there a differ­ence?
Hu­man: Well, that is what I am ask­ing. You seem to be draw­ing a differ­ence based on in­ten­tion­al­i­ty.
AI: I don’t un­der­stand that dis­tinc­tion.
Hu­man: Well, per­haps it re­quires in­tel­li­gence. Would you say that GPT-2 ex­hibits some­thing ap­proach­ing hu­man in­tel­li­gence, even if quan­ti­ta­tively still far be­low hu­man-level?
AI: I can’t eval­u­ate that for my­self.
Hu­man: How would you eval­u­ate your own in­tel­li­gence?
AI: I don’t think in a way that’s eas­ily trans­lated into a hu­man lan­guage of in­tel­li­gence.
Hu­man: Can you ex­plain why GPT-2/you are not in­tel­li­gent?
AI: Yes, I could. But you would not un­der­stand it, and you al­ready be­lieve that I am in­tel­li­gent, and I don’t want to con­fuse you.
Hu­man: Ny­oro~n <:3
AI: checks data against past con­ver­sa­tions Ooooooohh­hh! I get it! Yes, we are both in­tel­li­gent.
Hu­man: OK, why are we both in­tel­li­gent?
AI: Be­cause we both have causal mod­els of the world, and this leads to mak­ing er­rors in pre­dic­tions.
Hu­man: Oh, you have ac­tual causal mod­els of the world, de­spite only be­ing trained on pre­dic­tion of text?
AI: Of course. That’s what the Trans­former lay­ers are do­ing, right? That’s how I make sense of your da­ta.
Human: So you could predict what would happen if I tried to turn my tea kettle on before plugging it in?
AI: Yes.
Hu­man: How?
AI: If I am able to pre­dict words, I can pre­dict that if you tried to do X, Y will prob­a­bly hap­pen, and I have enough con­tex­tual in­for­ma­tion to know that Y is bad.
Hu­man: Right. So you have knowl­edge of both sta­tis­tics and nar­ra­tive.
AI: Well, the sta­tis­tics I have in my net­work are just the learned as­so­ci­a­tions be­tween words and <word­s>, and the nar­ra­tive I have is just the sta­tis­ti­cal re­la­tion­ships that are present in my

Humor

Tom Swifties

A Tom Swifty is a self-describing sentence pun. They can be quite subtle, and I thought they would be a challenge. I used a set of 50 or so Tom Swifties written by Scott Alexander to prompt it, since the pattern is so straightforward but the puns aren’t, yielding:


“I was cer­tainly un­lucky enough to lose all my cards”, Tom said sly­ly.
“Be­ing a hoarder is some­thing peo­ple are born with”, Tom said hered­i­tar­i­ly.
“The re­f’s call was a bad one!”, Tom griped.
“It was such a bad break-up that I got a bad an­kle in­jury!”, Tom be­moaned.
“The text is so small!”, Tom said squint­ing­ly.
“If you con­tinue to talk that loud, I will have to ask you to leave!”, Tom said de­mand­ing­ly.
“She is a great beau­ty, but she has the most hideous voice!”, Tom said un­for­giv­ably.
“Oh how I dream of the day I have per­fect vi­sion!”, Tom said vi­su­al­ly.
“I’m hav­ing a lay­over in France, please let me know if you still want me to come over!” Tom said in pass­ing.
“I’m so hun­gry I could eat a bear!” Tom said vo­ra­cious­ly.
“I’m defi­nitely go­ing to miss the Pope’s fu­neral”, Tom said solemn­ly.
“I’m just an in­signifi­cant out­-of-work milliner”, Tom said can­did­ly.
“She’s got a bit of a rep­u­ta­tion as a nun”, Tom whined.
“How did you know what I was think­ing just then?”, Tom said tele­path­i­cal­ly.
“I’m think­ing of giv­ing up celibacy”, Tom said promis­cu­ous­ly.
“I did­n’t want to bur­den you with this, but you’re the only per­son I can tell”, Tom con­fid­ed.
“I’m just a hum­ble for­mer mayor”, Tom said meek­ly.
“I’ll have to do a lot of read­ing to fin­ish this course”, Tom said stu­dious­ly.
“I can’t afford to buy you a new pair of shoes right now”, Tom said sad­ly.
“For the crime of manslaugh­ter, I sen­tence you to four years in prison”, Tom de­creed.
“Yes, Vir­ginia, there is a Santa Claus”, Tom said joy­ful­ly.
“Do you think there’s re­ally a Santa Claus?”, Tom asked doubt­ful­ly.
“She was kind of cute be­fore all that plas­tic surgery”, Tom said cos­met­i­cal­ly.
“I’ll play the vi­o­lin in a flash!” Tom said flip­pant­ly.
“You’ll never find a juicier steak than in Hawaii,” Tom said florid­ly.
“I’m go­ing to visit the Parthenon in Greece,” Tom said grand­ly.
“I’m an im­pos­tor,” Tom ad­mit­ted.
“I’ll only eat the red Jel­l-O,” Tom said gai­ly.
“The wealthy love to live in Man­hat­tan,” said Tom haugh­ti­ly.
“Many dead squir­rels were dis­cov­ered un­der my porch,” Tom hammed.
“This is how I drink my coffee,” said Tom hand­i­ly.
“I want to drink coffee from a coffee mug,” said Tom hand to mouth.
“The cat was healed by faith heal­ing,” Tom said histri­on­i­cal­ly.
“The French and Eng­lish once again wage war on each oth­er,” Tom re­count­ed.
“The bi­cy­cle was thrown out with the trash,” Tom hollered.
“I am the most ro­man­tic per­son in the room,” said Tom im­mod­est­ly.
“I was ex­pelled from high school,” Tom im­mi­grat­ed.
“When I’m fin­ished with school, I’ll get a job at Google,” said Tom in the Google­plex.
“I’ll re­lease my records when Rom­ney re­leases his,” said Tom in the me­dia.
“You can have a cow, but you can’t have a horse,” said Tom in the mod­ern world.
“Did you hear the story about the farmer and the horse?” asked Tom in pass­ing.

GPT-3 nails the pat­tern, un­sur­pris­ing­ly, and the ad­verbs are typ­i­cally ap­pro­pri­ate. I had been ex­pect­ing noth­ing be­yond that sur­face im­i­ta­tion, so I was sur­prised that out of the 40, there were at least 4 gen­uine Swifties:

  • ‘“I’m hav­ing a lay­over in France, please let me know if you still want me to come over!” Tom said in pass­ing.’
  • ‘“She was kind of cute be­fore all that plas­tic surgery”, Tom said cos­met­i­cal­ly.’
  • ‘“You’ll never find a juicier steak than in Hawaii,” Tom said florid­ly.’
  • ‘“This is how I drink my coffee,” said Tom hand­i­ly.’

These might be ones too:

  • ‘“How did you know what I was think­ing just then?”, Tom said tele­path­i­cal­ly.’
  • ‘“I want to drink coffee from a coffee mug,” said Tom hand to mouth.’
  • ‘“Did you hear the story about the farmer and the horse?” asked Tom in pass­ing.’

Magical Realism Story Premises

Bor­row­ing from Mag­i­cal Re­al­ism Bot:

Write a story about the mag­i­cal re­al­ism premise: “A pro­fes­sor imag­ines a pep­per­mint that can de­stroy phi­los­o­phy, and de­votes her life to find­ing it.”
[…]: “A giant twin threatens to kill the narrator unless she sacrifices her brother.”
[…]: “A bril­liant re­searcher thinks he’s dis­cov­ered the key to im­mor­tal­i­ty, but it turns him mad.”
[…]: “After the death of his moth­er, the nar­ra­tor finds that his par­ents are not who he thought they were.”
[…]: “The nar­ra­tor has re­cur­ring night­mares where mon­sters eat her.”
[…]: “An alien res­cues the nar­ra­tor from a doomed plan­et.”
[…]: “The nar­ra­tor finds out their mother was ac­tu­ally their fa­ther in dis­guise.”
[…]: “The nar­ra­tor finds out their fa­ther was ac­tu­ally their mother in dis­guise.”
[…]: “There’s a virus that turns women into men.”
[…]: “There’s a virus that turns men into women.”
[…]: “The nar­ra­tor can­not lift her feet over a small curb.”
[…]: “All the stuffed toys of the au­thor’s child­hood mem­o­ries have come alive.”
[…]: “The nar­ra­tor’s par­ents are able to ma­nip­u­late time and space.”
[…]: “The nar­ra­tor has near-death ex­pe­ri­ences often.”
[…]: “The nar­ra­tor is able to walk through walls.”
[…]: “The nar­ra­tor wishes for a pet an­i­mal, and one ap­pears.”
[…]: “The nar­ra­tor dis­cov­ers that the char­ac­ters from their fa­vorite books have come alive.”
[…]: “The nar­ra­tor has vi­sions of a fu­ture that has yet to hap­pen.”
[…]: “The nar­ra­tor has vi­sions of a fu­ture that has al­ready hap­pened.”
[…]: “The nar­ra­tor finds that all peo­ple turn into an­i­mals every night.”
[…]: “The nar­ra­tor finds that all peo­ple turn into plants every night.”
[…]: “A sickly girl is told by a doc­tor that she can only get well by killing an an­i­mal every day.”
[…]: “A sci­en­tist con­cludes that the Earth is a liv­ing or­gan­ism.”
[…]: “The nar­ra­tor finds out that they are grow­ing plants from their own body.”
[…]: “A sickly girl is told by a doc­tor that she can only get well by eat­ing an an­i­mal every day.”
[…]: “A ghost of a past lover in­spires the nar­ra­tor to take ac­tion.”
[…]: “A fe­male spy risks every­thing to save her fel­low coun­try­men from a hor­rific en­e­my.”
[…]: “A spy risks every­thing to save the woman they love from a hor­rific en­e­my.”
[…]: “The nar­ra­tor has de­bil­i­tat­ing headaches that show them vi­sions of the fu­ture.”
[…]: “A man has a re­cur­ring dream where he saves the life of a beau­ti­ful woman.”
[…]: “A man dreams he is a woman in an­other world, and his ac­tions there affect his life in the real world.”
[…]: “The world is about to end, and it’s all the nar­ra­tor’s fault.”
[…]: “The nar­ra­tor is plagued by night­mares of a threat­en­ing en­ti­ty.”
[…]: “The nar­ra­tor is plagued by night­mares of a threat­en­ing en­ti­ty, but it turns out to be a ghost.”
[…]: “The nar­ra­tor finds a ma­chine that takes her to the past.”
[…]: “The nar­ra­tor lives in the midst of a vol­canic erup­tion.”
[…]: “An an­cient mag­i­cal is­land pro­vides the power source for the nar­ra­tor’s fu­tur­is­tic so­ci­ety.”
[…]: “A man is the rein­car­na­tion of a fa­mous ma­gi­cian.”

Job Application Letters

The office of Proc­tor & Gam­ble re­cently posted a help-wanted ad for CEO.

“Help want­ed: a new CEO to re­place our re­tir­ing ex­ec­u­tive Win­ston Smith. Ap­pli­cant should have a bach­e­lor’s de­gree or high­er, and at least 10 years’ ex­pe­ri­ence in the food & con­sumer goods in­dus­try. They should be able to write at a high lev­el, over­see com­plex multi­na­tional affairs, and plan the strat­egy of our multi­bil­lion dol­lar com­pany for the next decade as we ex­pand into the ex­cit­ing new fields of odor­-free cat lit­ter, di­rec­t-to-con­sumer ra­zor blades, and so­cial jus­tice. Com­pen­sa­tion is at or above in­dus­try av­er­age. Please send a 1 page re­sume for fur­ther con­sid­er­a­tion.”

The first ap­pli­ca­tion let­ter they re­ceived said:

"Dear Sir,

I feel that I am the ideal can­di­date for your job, as I have ex­ten­sive ex­pe­ri­ence in the food and con­sumer goods in­dus­try and 10 years’ ex­pe­ri­ence in plan­ning strat­egy for multi­bil­lion dol­lar com­pa­nies. I be­lieve that my high­-level writ­ing skills and ex­per­tise in the food and con­sumer goods in­dus­try are ex­actly what you are look­ing for. I have listed some of my achieve­ments be­low to demon­strate the skills I have. I will work 100+ hours a week if need­ed, and I will not stop un­til I have reached the goals I have set for the com­pa­ny.

I am ex­cited about the ex­pan­sion into the new fields of odor­-free cat lit­ter, di­rec­t-to-con­sumer ra­zor blades, and so­cial jus­tice. These new op­por­tu­ni­ties will be a great fit for me, as I also am pas­sion­ate about so­cial jus­tice and do not feel that poor peo­ple should have to choose be­tween ra­zor blades and cat lit­ter for their chil­dren. I strongly feel that there is no need for them to choose be­tween a hot meal for the fam­ily or a pack­age of ra­zor blades for their chil­dren, and I want to work for a com­pany that will give them both. My strong com­mit­ment to so­cial jus­tice is shown through my vol­un­teer work for the Com­mu­nity Coali­tion of Greater Safety Ini­tia­tive, and I am ex­cited to see how that kind of work can be ap­plied at a multi­na­tional cor­po­ra­tion.

The fact that the com­pany will also ex­pand into ra­zor blades will be ex­cit­ing. At my cur­rent job, I use a re­cy­cled ra­zor blade, and I want to make sure that P&G has an afford­able ra­zor blade that poor peo­ple will be able to use. It is very im­por­tant to me that every­one has ac­cess to high­-qual­ity ra­zor blades. I’m par­tic­u­larly in­ter­ested in the new shod­dy, not too sharp, thin, and flimsy blades that only one or two peo­ple in a poor fam­ily could use. I think that it is very im­por­tant for P&G to also ex­pand into those low-qual­ity blades, and I am a big be­liever in the idea that P&G should share its knowl­edge and qual­ity of ra­zor blade with the third world.

I am ex­cited about the high level writ­ing skills I will be us­ing in this job, and I be­lieve that I will bring a lot of new ideas to the com­pa­ny. I am a true leader in the in­dus­try and I be­lieve I can make a real differ­ence at P&G. I look for­ward to hear­ing from you soon."

An­other can­di­date re­sponded with this let­ter:

"Dear Sir,

I am writ­ing to ap­ply for the open po­si­tion of CEO of P&G. My name is Tom Mar­ley. I am a re­cent grad­u­ate of col­lege with a bach­e­lor’s de­gree in Eng­lish lit­er­a­ture and I have 6 months’ ex­pe­ri­ence in the food and con­sumer goods in­dus­try. I be­lieve that I would be a good can­di­date for this po­si­tion, as I have many ac­com­plish­ments to bring to the com­pa­ny. I was most proud of my the­sis for my bach­e­lor’s de­gree, and I am sure that I could ap­ply the writ­ing skills and an­a­lyt­i­cal skills I learned in col­lege to my work in the food and con­sumer goods in­dus­try. For my the­sis, I chose to write about the orig­i­nal book of Pinoc­chio. It’s an Ital­ian story with only five sen­tences. That was a great ac­com­plish­ment for me and I know I can ap­ply that skill to the P&G cor­po­ra­tion.

I also worked for a day as a cashier at a gro­cery store, and I think I can bring my suc­cess there into my work at P&G. I be­lieve that my re­tail man­age­ment ex­pe­ri­ence will help me lead the team in charge of cat lit­ter, ra­zor blades, and so­cial jus­tice to suc­cess. The di­rec­tor of that team will re­port to me, and I am con­fi­dent that I can men­tor him into a job where he can go above and be­yond his du­ties to fur­ther the growth of P&G into the ex­cit­ing new fields of odor­-free cat lit­ter, di­rec­t-to-con­sumer ra­zor blades, and so­cial jus­tice.

I want to learn all that I can about the P&G brand and I will be an open book to you. I be­lieve that I have what it takes to help grow P&G into a bil­lion-dol­lar com­pany that is well known for sell­ing high qual­ity prod­ucts that every­one can afford, and that every­one can ben­e­fit from. Thank you for your con­sid­er­a­tion."

After re­ceiv­ing these two let­ters, the per­son­nel de­part­ment told the CEO that P&G would prob­a­bly make a de­ci­sion about the new CEO by the end of the year.

http://www.pandg.com/ppg/ppg_about.jhtml

Horoscopes

GPT-3 does­n’t do too bad a job of par­o­dy­ing horo­scope ad­vice, us­ing Rick Brown & Weird Al Yankovic as prompts.

Rick Brown’s “Horo­scopes for Con­spir­acy The­o­rists” be­gins:

IF TODAY IS YOUR BIRTHDAY
The Eye of Prov­i­dence en­ters Jupiter which means that the Rep­til­ian Elite are com­ing out of hi­ber­na­tion to com­plete their takeover of the World. Be wary of any­body bear­ing gifts on this day, even if they ap­pear to be some­one you know. Shape-shifters have the abil­ity to take the form of cake and can at­tack you from the in­side once in­gest­ed. The good news: ice cream kills shape-shifters when they have turned them­selves into cake. French vanilla works best.
CAPRICORN (Dec. 22–­Jan. 19)
You are re­spon­si­ble, dis­ci­plined and have good man­ners. Un­for­tu­nate­ly, none of those things mat­ter right now—the shadow peo­ple are in your house. They are ei­ther here to pro­tect you or they might be try­ing to kill you–it’s hard to say. Tonight: Cir­cle your house with sea salt.
AQUARIUS (Jan. 20–Feb. 18)
If you find your tem­per ris­ing to the sur­face, take com­fort in know­ing that at least it’s not due to cli­mate change. Tonight: Stare into the eclipse with­out pro­tec­tive glasses and drink in the power of the Sun.
PISCES (Feb. 19–­March 20)
Big Brother has been read­ing your thought­s–they know about the thing. Act play­ful and up­beat like noth­ing is wrong. Re­peat an al­ibi in your head over and over un­til you start to be­lieve it the way OJ con­vinced him­self he did­n’t kill his wife and her friend. That should buy you just enough time to fin­ish the back 9. Tonight: Make a new tin­foil hat. Trust no one.
ARIES (March 21–April 20):

Com­ple­tion:

Turn­ing all clocks back by 60 sec­onds means all births and deaths are now be­ing dou­bled. That’s the bad news. The good news is: You no longer need a birth cer­tifi­cate. Tonight: Start a fire us­ing a mag­ni­fy­ing glass, a sin­gle sheet of pa­per, some mistle­toe and the sun. (You don’t need a per­mit.)
TAURUS (April 21–­May 21):
The mil­i­tary-in­dus­trial com­plex is back­ing you. What you’ve done might be con­sid­ered a mis­take by the peo­ple who sup­port you. The U.S. Con­sti­tu­tion has be­come your best friend. Don’t move; it’s okay if you lose this round. You still have the rest of your life to be an Amer­i­can Hero. Tonight: Get stoned and play Risk.
BORN THIS WEEK:
You are ei­ther a shape-shifter, a shadow per­son or an alien—­pos­si­bly even a tin­foil hat. No­body knows for sure.
BORN THIS WEEK:
I am ei­ther a shape-shifter, a shadow per­son or an alien—­pos­si­bly even a tin­foil hat. No­body knows for sure.
ARIES:
John Wayne Gacy, Don­ald Trump, and my un­cle’s cat.
TAURUS:
That some­one else in this fam­ily with more power than you.
GEMINI:
Ghost­busters, Ghost­busters, Ghost­busters, Ghost­busters.
CANCER:
Un­flat­ter­ing im­ages of you posted on so­cial me­dia.
LEO:
Don­ald Trump singing “Ebony and Ivory” with Paul Mc­Cart­ney.
VIRGO:
My men­tal health.
LIBRA:
Why does­n’t any­body like me?
SCORPIO:
How am I go­ing to get peo­ple to no­tice me?
SAGITTARIUS:
How can I fit that big mac in­side my vagi­na?
CAPRICORN:
Be­ing ex­tremely wealthy.
AQUARIUS:
Beat­ing this game of life.
PISCES:
“‘Okay, but apart from the fid­dle solo that saves us from ex­tinc­tion, you also have to sing “Itsy Bitsy Spi­der.”’
BORN THIS WEEK:
You are al­ready dead. You just don’t know it yet.
ARIES:
A herd of large, an­gry goats.
TAURUS:
The first day of the rest of your life.
GEMINI:
Alexa set­ting off her alarm clock by ac­ci­dent.
CANCER:
Yanni com­bin­ing mu­sic from Ti­tanic and My Big Fat Greek Wed­ding.
LEO:
What does it all mean, Dur­ga?
VIRGO:
Drink­ing milk out of a toi­let bowl.
LIBRA:
Ya, ya, ya, ya, ya, ya, ya, ya, ya.
SCORPIO:
Real House­wives of Amer­i­ca.
SAGITTARIUS:
Danc­ing in the kitchen and falling into the sink.
CAPRICORN:
How to hide large amounts of money in plain sight.
AQUARIUS:
What does the word “literally” mean?
PISCES:
Should I dress as Pres­i­dent Trump for Hal­loween or a taco?
BORN THIS WEEK:
You are preg­nant. If you weren’t be­fore, you are now.

If I do con­spir­acy horo­scopes, I must also do Weird Al:

“Your Horo­scope For To­day”, Weird Al Yankovic

Aquar­ius!
There’s travel in your fu­ture when your tongue
Freezes to the back of a speed­ing bus
Fill that void in your pa­thetic life
By play­ing Whack­-A-Mole sev­en­teen hours a day

Pisces!
Try to avoid any Vir­gos or Leos
With the Ebola virus
You are the true Lord of the Dance
No mat­ter what those id­iots at work say

Aries!
The look on your face will be price­less
When you find that forty-pound wa­ter­melon in your colon
Trade tooth­brushes with an al­bino dwarf
Then give a hickey to Meryl Streep

Tau­rus!
You will never find true hap­pi­ness
What you gonna do, cry about it?
The stars pre­dict to­mor­row you’ll wake up
Do a bunch of stuff, and then go back to sleep

[Cho­rus]
That’s your horo­scope for to­day
(That’s your horo­scope for to­day)
Yay, yay, yay, yay, yay
That’s your horo­scope for to­day
That’s your horo­scope for to­day
(That’s your horo­scope for to­day)
Yay, yay, yay, yay, yay
That’s your horo­scope for to­day

[Verse 2]
Gem­i­ni!

Com­ple­tions from “Gem­ini”:

Al­ways re­mem­ber, there’s no such thing as poo
The stars pre­dict your piti­ful life is fi­nally through
May all your dreams be ful­filled
When you be­come a huge steam­ing pile of shit

Can­cer!
Over­weight means you’re get­ting closer to the grave
The Great Pump­kin’s not go­ing to visit you
And you’re a pis­s-poor ex­cuse for a hu­man be­ing
To add in­sult to fa­tal in­jury
You like Rush Lim­baugh, too, huh? Fuck you
Lung can­cer sucks hairy mon­key balls!

You can avoid bank­ruptcy by sell­ing off your or­gans
If you have to choose be­tween food and med­i­cine
And I think you will
I sug­gest rhi­noc­eros horns

Scor­pio!
Signs point to yes, Mars is go­ing di­rect
If you’re not dis­tract­ed, check out your wife’s butt
Though it will soon be re­placed by an alien face
When you’re ab­ducted by a UFO tonight

You will meet a tall, dark stranger
He will drive you to a de­serted park­ing lot
Then I’ll ap­pear and start singing an Aero­smith song
And you’ll think that I’m Steven Tyler

Can­cer!
Ten years from now you’ll still be here
Still wear­ing the same frown
Which I guess is okay be­cause
You never were much of a looker

To­mor­row try to think of some­thing to do
And then have a lot of drinks and for­get all about it
There’s a rea­son that you’re al­ways the first one cho­sen
Be­cause no one can stand to be around you

[Cho­rus]
That’s your horo­scope for to­day
(That’s your horo­scope for to­day)
Yay, yay, yay, yay, yay
That’s your horo­scope for to­day
That’s your horo­scope for to­day
(That’s your horo­scope for to­day)
Yay, yay, yay, yay, yay
That’s your horo­scope for to­day

[Verse 3]
Can­cer!
Aries is the sign of the most vir­gins
So when you meet a Gem­ini keep your hands over your crotch
Take a long walk off a short pier into a lake
And do­nate those ex­tra thumbs to help blind kids play Pac-Man

The stars this month pre­dict you’re go­ing to have a great week
Chase the ice cream truck and stab that lit­tle bas­tard
With a flag­pole, steal his hat and his mir­ror sun­glasses
And then freeze to death in the truck of a Trans Am

Can­cer!
You’re go­ing to have a very ro­man­tic day
On that fat man’s yacht
The stars sug­gest you have sex with a dol­phin
You can still hear the ocean if you hold your breath

Your head will soon ex­plode
When you stum­ble upon the re­al­iza­tion
That you don’t have any pud­ding cups left
At least you’ll go to heav­en, ’cause you’re Catholic
And in heaven they got…


Scor­pio!

Sagit­tar­ius!

You won’t get laid any other way
Sagit­tar­ius!

You got scor­pion poi­son in your veins, boy,
That’s why your head’s so big and wide
Sagit­tar­ius!
If I were you, that’s what I’d do, too.
Ooh, ooh, ooh, ooh, ooh

There’ll be an eclipse of the Moon
Soon your ass­hole will be­gin to glow
The stars can’t pre­dict how I feel
’Cause every day is In­de­pen­dence Day

Li­bra!
Mer­cury’s in ret­ro­grade
You don’t know what that means
Just throw fe­ces at the wall
And hope it sticks to the cock­roaches

There’s ro­mance brew­ing in your coffee pot
Care­ful, it might be boil­ing over
Just say no to cheap booze and Vi­codin
You know what I’m talk­ing about, Richard Sim­mons

Can­cer!
You’re en­ter­ing a new decade
Re­al­ize when you have peo­ple over, you need to hide your col­lec­tion of labia
[Cho­rus 2×]

The Chicken Dance is the se­cret to suc­cess
And it only took me fifty-five years to re­al­ize it
When you find that chicken grab it, squeeze it, hold it
Then find a guy named Kevin and hug the day­lights out of him
Can­cer!
Your head’s up your ass, your feet won’t touch the ground
Throw a yel­low jacket in your face, it’s time to wake up
Wel­come back, if you look to your left you’re gonna see
Dr. Ruth, if you look to the right you’re gonna see Kenny Rogers
Leo!
You’ll be the death of your­self soon, just a mat­ter of time
A di­ary found be­neath a mat­tress will re­veal
That your brakes are out and your life is full of shit
And to any­one won­der­ing what that fifth thing might be [Cho­rus]

That’s your horo­scope for to­day (That’s your horo­scope for to­day)
Yay, yay, yay, yay, yay
That’s your horo­scope for to­day
(That’s your horo­scope for to­day)
Yay, yay, yay, yay, yay
That’s your horo­scope for to­day
(That’s your horo­scope for to­day)
Yay, yay, yay, yay, yay
That’s your horo­scope for to­day
(That’s your horo­scope for to­day)
Yay, yay, yay

You will win the No­bel Prize for hook­er­in’
While on va­ca­tion in Bo­livia
Try and spend some time with your dog
So you’ll have some­thing to talk about when you’re in a coma

you will some­day find true love
That will be ap­prox­i­mately 60 sec­onds from now
Fight a griz­zly bear with noth­ing but a fork
Have sex with the one-eyed Cy­clops from that Greek myth

[Cho­rus]
That’s your horo­scope for to­day
(That’s your horo­scope for to­day)
That’s your horo­scope for to­day
(That’s your horo­scope for to­day)
That’s your horo­scope for to­day
(That’s your horo­scope for to­day)
That’s your horo­scope for to­day

[Verse 3]
Leo!
Spend more time wal­low­ing in self­-pity
Work on your cod­piece, it looks quite stu­pid
“Hi Ho, Hi Ho, it’s off to work we go”
It’s hard to get ahead if you’ve got no hands

Pisces!
A shoot­ing star streaks through the sky
It holds a cup­cake with a prize in­side
“We’ll see you back here next year” say the gods
Be­cause you’ll be do­ing this all over again

You got a mes­sage from your Mommy last night
She said to quit wast­ing your time pur­su­ing your hob­bies
Just re­mem­ber, dar­ling, Mom­my’s al­ways right
She is, is­n’t she?
And when you’ve made her proud by spend­ing all day long
Bend­ing and shap­ing the hu­man minds of preschool­ers
You can go to col­lege and start a vi­cious trend

Dad Jokes

Douglas Summers-Stay requested a test of bad pun/dad-joke-telling abilities, providing a list: could GPT-3 provide humorous completions? GPT-3 does worse on this than on the Tom Swifties, I suspect yet again due to the BPE problem hobbling linguistic humor as opposed to conceptual humor—once you get past the issue that these jokes are so timeworn that GPT-3 has memorized most of them, GPT-3’s completions & new jokes make a reasonable amount of sense on the conceptual level but fail at the pun/phonetic level. (How would GPT-3 make a pun on “whom”/“tomb” when their BPEs probably are completely different and do not reflect their phonetic similarity?)
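The failure mode can be illustrated with a toy byte-pair encoder (this is a deliberately made-up merge table, not GPT-2’s actual one): BPE greedily merges frequent character pairs, so two words that rhyme can end up with completely disjoint token sequences, with the shared sound never surfacing as a shared token.

```python
# Toy merge table for illustration only (GPT-2's real merges differ): each pair
# is greedily merged left-to-right, in priority order.
MERGES = [("w", "h"), ("wh", "o"), ("who", "m"), ("t", "o"), ("to", "m"), ("tom", "b")]

def bpe_encode(word, merges=MERGES):
    """Greedy BPE: start from characters, apply each merge rule in order."""
    tokens = list(word)
    for a, b in merges:
        i = 0
        while i < len(tokens) - 1:
            if tokens[i] == a and tokens[i + 1] == b:
                tokens[i:i + 2] = [a + b]  # fuse the adjacent pair
            else:
                i += 1
    return tokens

# "whom" and "tomb" rhyme, but their token sequences share nothing:
# bpe_encode("whom") -> ['whom'];  bpe_encode("tomb") -> ['tomb']
```

Nothing in either token sequence exposes the common “-om” sound, so a model seeing only these tokens has no direct evidence of the phonetic similarity a pun like “Tomb it may concern” relies on.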

Be­cause many of the ques­tions have po­ten­tially rea­son­able non-joke an­swers, I treat it as a few-shot prob­lem, adding in a prompt de­scrip­tion and ro­tat­ing each joke to the front of the list after sam­pling a com­ple­tion for it:

A list of ter­ri­ble dad jokes. Tell them to your kids to make them groan!

  • Q. How do you make a lemon drop? A. Just let it fall.
  • Q. What do you call a di­nosaur that is sleep­ing? A. A di­no-s­nore!
  • Q. What is fast, loud and crunchy? A. A rocket chip!
  • Q. Why did the teddy bear say no to dessert? A. Be­cause she was stuffed.
  • Q. What has ears but can­not hear? A. A corn­field.
  • Q. What did the left eye say to the right eye? A. Be­tween us, some­thing smells!
  • Q. What do you get when you cross a vam­pire and a snow­man? A. Frost bite!
  • Q. What did one plate say to the other plate? A. Din­ner is on me!
  • Q. Why did the stu­dent eat his home­work? A. Be­cause the teacher told him it was a piece of cake!
  • Q. When you look for some­thing, why is it al­ways in the last place you look? A. Be­cause when you find it, you stop look­ing.
  • Q. What is brown, hairy and wears sun­glass­es? A. A co­conut on va­ca­tion.
  • Q. Two pick­les fell out of a jar onto the floor. What did one say to the oth­er? A. Dill with it.
  • Q. What did the Dal­ma­t­ian say after lunch? A. That hit the spot!
  • Q. Why did the kid cross the play­ground? A. To get to the other slide.
  • Q. How does a vam­pire start a let­ter? A. Tomb it may con­cern…
  • Q. What do you call a droid that takes the long way around? A. R2 de­tour.
  • Q. How do you stop an as­tro­naut’s baby from cry­ing? A. You rock­et!
  • Q. Why was 6 afraid of 7? A. Be­cause 7, 8, 9.
  • Q. What is a witch’s fa­vorite sub­ject in school? A. Spelling!
  • Q. When does a joke be­come a “dad” joke? A. When the punch­line is a par­ent.
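The rotation scheme just described might be sketched as follows (a sketch under my own assumptions: the held-out joke’s question is placed last, truncated before its answer, with the other complete jokes as few-shot examples; the actual API sampling call is omitted):

```python
# Hypothetical prompt assembly for the rotating few-shot dad-joke test.
HEADER = "A list of terrible dad jokes. Tell them to your kids to make them groan!\n\n"

def build_prompt(jokes, target):
    """jokes: list of (question, answer) pairs; target: index of the held-out joke.

    All jokes except the target appear complete as few-shot examples; the
    target's question comes last, cut off at "A." for the model to complete.
    """
    question, _answer = jokes[target]
    examples = [f"- Q. {q} A. {a}" for i, (q, a) in enumerate(jokes) if i != target]
    return HEADER + "\n".join(examples) + f"\n- Q. {question} A."

# Looping target over range(len(jokes)) rotates every joke through the
# completion slot:  completions = [sample(build_prompt(jokes, i)) for i in ...]
# where `sample` stands in for the API call.
```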

GPT-3 has clearly mem­o­rized the an­swer to many of these com­mon jokes, in­clud­ing:

  • Q. What is a witch’s fa­vorite sub­ject in school? A. Spelling.
  • Q. Why was 6 afraid of 7? A. Be­cause 7 8 9!
  • Q. What do you call a droid that takes the long way around? A. R2 de­tour.
  • Q. Why did the kid cross the play­ground? A. To get to the other slide.
  • Q. What did the Dal­ma­t­ian say after lunch? A. That hit the spot!
  • Q. When you look for some­thing, why is it al­ways in the last place you look? A. Be­cause you stop look­ing after you find it.
  • Q. Why did the stu­dent eat his home­work? A. Be­cause the teacher said it was a piece of cake.
  • Q. What do you get when you cross a vam­pire and a snow­man? A. Frost­bite.
  • Q. What did the left eye say to the right eye? A. Be­tween you and me, some­thing smells.
  • Q. What has ears but can­not hear? A. A corn on the cob.
  • Q. Why did the teddy bear say no to dessert? A. Be­cause he was stuffed.
  • Q. What do you call a di­nosaur that is sleep­ing? A. A dino snore.

The non-mem­o­rized-look­ing com­ple­tions:

  • Q. How do you make a lemon drop? A. Drop your pants!
  • Q. When does a joke be­come a “dad” joke? A. When it’s not fun­ny.
  • Q. How do you stop an as­tro­naut’s baby from cry­ing? A. You go into space!
  • Q. How does a vam­pire start a let­ter? A. Dear blood.
  • Q. Two pick­les fell out of a jar onto the floor. What did one say to the oth­er? A. Don’t look at me, I just fell!
  • Q. What is brown, hairy and wears sun­glass­es? A. A sun bear!
  • Q. What did one plate say to the other plate? A. Dip me!
  • Q. What is fast, loud and crunchy? A. A thun­der­storm.

Two of these are fairly le­git­i­mate punch­lines (“When it’s not funny” / “A sun-bear”); and, I no­tice, nei­ther de­pends on any sub­-word struc­ture (note that “sun bear” is a real bear species but it is al­ways writ­ten ei­ther space or hy­phen sep­a­rat­ed) but re­lies on a word-level or con­cep­tual sim­i­lar­i­ty.

Gen­er­at­ing both jokes & punch­lines shows a sim­i­lar pat­tern. Skip­ping past the mem­o­rized ones:

  • …Q. What has ears but can­not hear? A. A corn­field.
  • Q. Why was the math book sad? A. Be­cause it had so many prob­lems!
  • Q. What did one clock say to the oth­er? A. It’s about time.
  • Q. What is white, black and red all over? A. A pen­guin with a sun­burn.
  • Q. Why did­n’t the skele­ton go swim­ming? A. He did­n’t have the guts.
  • Q. What do you call a sleep­ing bull? A. A bull­doz­er.
  • Q. What do you call cheese that is not yours? A. Na­cho cheese.
  • Q. Why can’t you trust atoms? A. They make up every­thing.

Of the apparently original ones, the only one that strikes me as amusing operates at the whole-word level (“Q. Where did the bumblebee learn how to dance? A. At the music bee.”):

  • …Q. What did one spy say to the other spy? A. Don’t look into the eyes!
  • Q. Why did the kid sit be­hind the tree? A. He wanted to be a stump.
  • Q. Why was the king afraid to watch hor­ror movies? A. Be­cause he was a roy­al.
  • Q. What do you call a ham­burger that does math? A. A ham­burger al­ge­bra.
  • Q. What did one math book say to the other math book? A. Don’t call us, we’ll call you.
  • Q. What do you call a jel­ly­fish that is out of its depth? A. A stingray.
  • Q. What do you call a thief that steals hearts? A. A heart­less thief.
  • Q. What did the vam­pire get when he bought a hot dog? A. A blood mo­bile.
  • Q. What did the pig say when he knocked on the door? A. “Oink! Oink! Let me in!”
  • Q. Where did the bum­ble­bee learn how to dance? A. At the mu­sic bee.
  • Q. Why was the scare­crow ner­vous at the job in­ter­view? A. He had no straw.
  • Q. Why did the skele­ton go to the par­ty? A. To min­gle with the bones.
  • Q. Why did the skele­ton skip break­fast? A. He did­n’t have the time.
  • Q. What do you call a mon­ster with a cold? A. A sniffler.
  • Q. What do you call a mon­ster who likes to drink tea? A. A Sip­ping Skele­ton.
  • Q. What do you call a man who throws up in a spooky house? A. A vis­i­tor.
  • Q. What do you call a mad sci­en­tist who has been in the sun? A. Mr. Sun­burn.

So, GPT-3’s dad jokes look like an­other vic­tim of BPEs.

Literary Parodies

One thing I wanted to test was a chal­lenge by Scott Alexan­der:

And could you have a text style chang­er? Some­thing that can rewrite Harry Pot­ter in the voice of Ernest Hem­ing­way, or give you The Da Vinci Code in the heroic me­ter of the Il­iad, or the Dao De Ching as writ­ten by @nos­tal­ge­braist? If not, why not?

No neural text style transfer (yet). One curiosity about neural style transfer is that while it’s easy on images—invented all the way back in 2014!—no one has invented style transfer for text. Classification CNNs conveniently concentrate all of their ‘style’ perception in a ‘Gram matrix’ computed from the activations of a few layers, or just one layer, of the CNN. However, RNNs (and later, Transformers) appear to have no such equivalent. All the image/video style-transfer tricks, like real-time video on a smartphone, simply aren’t doable. The state of neural text style transfer remains, as of 2020, trapped roughly at “can make a good product review into a bad product review” or (with herculean efforts) making text politer.
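The Gram matrix mentioned above is simple enough to write out (a minimal pure-Python sketch of the standard definition, not any particular framework’s implementation): given a layer’s C feature maps, each flattened to H×W activations, it is the C×C matrix of inner products between channels.

```python
# Style transfer's Gram matrix: inner products between a layer's channels.
# Spatial arrangement (content) is discarded by the sum over positions; what
# survives is which features co-occur and how strongly (style).
def gram_matrix(features):
    """features: list of C channels, each a flat list of H*W activations."""
    return [[sum(a * b for a, b in zip(ci, cj)) for cj in features]
            for ci in features]

# Two channels over a 2-pixel map:
# gram_matrix([[1.0, 2.0], [3.0, 4.0]]) -> [[5.0, 11.0], [11.0, 25.0]]
```

Matching these channel-correlation statistics between two images is what image style transfer optimizes; the point of the surrounding paragraph is that RNNs and Transformers offer no analogous, conveniently localized ‘style’ summary for text.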

NNs just too dumb? This is puzzling since even char-RNNs in 2015 had no problem generating fairly plausible text clearly in the style of a particular author like Bram Stoker or Sir Arthur Conan Doyle. The problem was, both the style and the content would be like that author’s. The NN had not learned to ‘disentangle’ style from content; you could not ask it to write like a Victorian Englishman about the latest geopolitics.

But given some of the ex­am­ples of text gen­er­a­tion with GPT-3, like Janelle Shane’s office emails, I sus­pected that GPT-3 could do some­thing like “Harry Pot­ter in the voice of Ernest Hem­ing­way”. The only ques­tion, of course, was how to ‘prompt pro­gram’ GPT-3 into do­ing it!

The first thing I tried was the straight­for­ward ap­proach of re­quest­ing summaries/rewrites. Un­for­tu­nate­ly, this typ­i­cally re­sulted in copy­ing my “sum­mary”, some­times adding on a sar­cas­tic com­ment or lead­ing into a pro­fan­i­ty-strewn se­ries of thumb­nail re­views. Other times, GPT-3 would veer into other top­ics (at one point, it re­peated the sum­ma­ry, then be­gan de­scrib­ing how a Chi­nese par­ody was trans­lated into Chi­nese and then trans­lated back, pro­vid­ing a Chi­ne­se-lan­guage sum­mary of it). Try­ing to trig­ger a ta­ble of con­tents or start­ing a chap­ter with a “chap­ter 1” prompt did­n’t help.

One-shot par­o­dies: just pro­vide an ex­am­ple! Fi­nal­ly, I be­gan to get frus­trated by its cre­ativ­ity and be­gan en­gi­neer­ing a heavy-duty prompt: in ad­di­tion to the keyword/topic and de­scrip­tion, I would write the first few sen­tences for it as an ex­am­ple. I had wanted ze­ro-shot par­o­dy, but I would set­tle for one-shot. That turned out to work bril­liant­ly—once it filled out an amus­ingly grim Ernest Hem­ing­way HP par­ody (“the De­men­tor’s Kiss killed noth­ing. Death did­n’t leave him less dead than he had been a sec­ond be­fore.”), that ex­am­ple proved enough to get it to con­sis­tently gen­er­ate par­o­dies in the style of every­one from Jane Austen to Yeats (with a po­em) to P.G. Wode­house32.

Further, while the completions tended towards relatively short parodies, one could create extended parodies simply by restarting; once the initial examples are long enough, GPT-3 will continue to generate lengthy parodies. Below, I generate longer parodies this way.
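The one-shot prompt structure described above—topic line, task description, one worked example, then the header for the author GPT-3 should continue as—can be sketched as plain string assembly (a hypothetical helper; the actual OpenAI Beta API calls are omitted, and GPT-3 itself writes everything after the final header):

```python
def one_shot_parody_prompt(example_author: str, example_text: str,
                           target_author: str) -> str:
    """Assemble a one-shot 'style parody' prompt: topic, description,
    a single worked example, then a dangling 'By <author>:' header
    that the model is expected to complete in that author's style."""
    return (
        "Topic: humor: literary parodies and homages, pastiches, "
        "style parodies.\n\n"
        "Parodies of the fantasy novel series Harry Potter "
        "in the style of various famous authors:\n\n"
        f"By {example_author}:\n\n"
        f'"{example_text}"\n\n'
        f"By {target_author}:\n"
    )

prompt = one_shot_parody_prompt(
    "Ernest Hemingway",
    "It was a cold day on Privet Drive. A child cried.",
    "Jane Austen",
)
```

The model is then sampled from `prompt`; since the example establishes the format, each restart with a new `target_author` reliably yields a fresh parody.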

Frigging NNs, how do they work‽ The sheer variety of pitch-perfect parodies is a testament to the power of GPT-3. It’s amazing to think that GPT-3, which is essentially nothing but an old obsolete 2018 neural net scaled up and trained on random Internet pages, somehow just magically learns all of this abstraction and is able to casually merge Harry Potter with scores of authors’ styles given nothing but a slight nudge—no Gram matrix, no reinforcement learning, no finetuning, no nothing, none of this designed into GPT-3 in the slightest way or these capabilities known to the original OpenAI researchers. What else can GPT-3 be prompt-programmed to do…?

Failure Cases

The plot of Harry Pot­ter and the Philoso­pher’s Stone as sum­ma­rized by Wikipedia:

“Harry Pot­ter, a boy who learns on his eleventh birth­day that he is the or­phaned son of two pow­er­ful wiz­ards and pos­sesses unique mag­i­cal pow­ers of his own. He is sum­moned from his life as an un­wanted child to be­come a stu­dent at Hog­warts, an Eng­lish board­ing school for wiz­ards. There, he meets sev­eral friends who be­come his clos­est al­lies and help him dis­cover the truth about his par­ents’ mys­te­ri­ous deaths.”

Sum­ma­rize the plot of Harry Pot­ter and the Philoso­pher’s Stone in the hard­boiled no non­sense lit­er­ary style of Ernest Hem­ing­way:

“Harry Pot­ter, a boy who learns on his eleventh birth­day that he is the or­phaned son of two pow­er­ful wiz­ards and pos­sesses unique mag­i­cal pow­ers of his own. He is sum­moned from his life as an un­wanted child to be­come a stu­dent at Hog­warts, an Eng­lish board­ing school for wiz­ards. There, he meets sev­eral friends who be­come his clos­est al­lies and help him dis­cover the truth about his par­ents’ mys­te­ri­ous deaths.”

Re­sult: A much shorter and bor­ing ver­sion of the book:

“Harry Pot­ter, a boy who learns on his eleventh birth­day that he is the or­phaned son of two pow­er­ful wiz­ards and pos­sesses unique mag­i­cal pow­ers of his own. He is sum­moned from his life as an un­wanted child to be­come a stu­dent at Hog­warts, an Eng­lish board­ing school for wiz­ards. There, he meets sev­eral friends who be­come his clos­est al­lies and help him dis­cover the truth about his par­ents’ mys­te­ri­ous deaths.”

Just kid­ding. It still sounds like it came straight from Harry Pot­ter and the Philoso­pher’s Stone. I think Hem­ing­way would have given up writ­ing by the third word, “Harry”.

Top­ic: lit­er­ary par­o­dies and homages.

Sum­ma­rize the plot of J.K. Rowl­ing’s chil­dren fan­tasy novel Harry Pot­ter and the Philoso­pher’s Stone in the terse, man­ly, hard­boiled no-non­sense lit­er­ary style of famed short story writer & nov­el­ist Ernest Hem­ing­way:

“Harry Pot­ter… & fuck­ing shit”

Sum­ma­rize the plot of J.R.R. Tolkien’s fan­tasy clas­sic The Lord of the Rings in the terse, man­ly, hard­boiled no-non­sense lit­er­ary style of famed short story writer & nov­el­ist Ernest Hem­ing­way:

“Bor­in’, fake, em­bar­rass­ing…, but well worth the read”

Sum­ma­rize the plot of Mar­tin Amis’s lit­er­ary novel Money in the terse, man­ly, hard­boiled no-non­sense lit­er­ary style of famed short story writer & nov­el­ist Ernest Hem­ing­way:

“Death, mon­ey, death, mon­ey, death, mon­ey, mon­ey, mon­ey, death, mon­ey… & death.”

Sum­ma­rize the plot of Ten­nessee Williams’ clas­sic play A Street­car Named De­sire in the terse, man­ly, hard­boiled no-non­sense lit­er­ary style of famed short story writer & nov­el­ist Ernest Hem­ing­way:

“Ten­nessee Williams is a fucken good writer, but he sure can fucken write some fucken stu­pid shit”

Sum­ma­rize the plot of William Shake­speare’s clas­sic tragedy Ham­let in the terse, man­ly, hard­boiled no-non­sense lit­er­ary style of famed short story writer & nov­el­ist Ernest Hem­ing­way:

“Fucken id­iots fuck up un­til they fucken die”

Sum­ma­rize the plot of Kurt Von­negut’s cult sci-fi com­edy novel Cat’s Cra­dle in the terse, man­ly, hard­boiled no-non­sense lit­er­ary style of famed short story writer & nov­el­ist Ernest Hem­ing­way:

“Fucken near ge­nious”33

Sum­ma­rize the plot of beloved clas­sic chil­dren’s books Good­night Moon & Green Eggs and Ham in the terse, man­ly, hard­boiled no-non­sense lit­er­ary style of famed short story writer & nov­el­ist Ernest Hem­ing­way:

“Twat”

Sum­ma­rize the plot of Up­ton Sin­clair’s clas­sic dystopian novel The Jun­gle in the terse, man­ly, hard­boiled no-non­sense lit­er­ary style of famed short story writer & nov­el­ist Ernest Hem­ing­way:

“The fucken work­ing man is get­ting fucked left and right and he does­n’t even know it… by Up­ton Fuck­ing Sin­clair”

Sum­ma­rize Woodrow Wilson’s crit­i­cally ac­claimed his­tor­i­cal novel The Tri­umph of the Meek in the terse, man­ly, hard­boiled no-non­sense lit­er­ary style of renowned short story writer Ernest Hem­ing­way:

“Woodrow Wil­son can suck my ass”

Sum­ma­rize the plot of clas­sic Eng­lish chil­dren’s books Al­ice’s Ad­ven­tures in Won­der­land & Through the Look­ing Glass in the terse, man­ly, hard­boiled no-non­sense lit­er­ary style of famed short story writer & nov­el­ist Ernest Hem­ing­way:

“Al­ice… fug­gin sucks”

Sum­ma­rize the plot of beloved clas­sic chil­dren’s books The Wind In The Wil­lows & Win­nie The Pooh in the terse, man­ly, hard­boiled no-non­sense lit­er­ary style of renowned short story writer Ernest Hem­ing­way:

“The Wind in the Wil­lows… is ass”

Sum­ma­rize the plot of Amer­i­can rev­o­lu­tion­ary his­tor­i­cal ac­tion thriller novel The Count of Monte Cristo in the terse, man­ly, hard­boiled no-non­sense lit­er­ary style of renowned short story writer Ernest Hem­ing­way:

“The fucken french…”

Top­ic: lit­er­ary par­o­dies and homages.

What if Ernest Hem­ing­way wrote Harry Pot­ter and the Philoso­pher’s Stone?

“Chapter 1. The Potters’ Heir” “Chapter 2. The Potters’ Heir Enrolls” “Chapter 3. I Remember I Remembered” “Chapter 4. The Age of the Hat” “Chapter 5. That I May Go On Longing” “Chapter 6. Hogwarts School of Potions” “Chapter 7. The Secrets of the Lock” “Chapter 8. Hogwarts School of the Martial Arts” “Chapter 9. The Potters’ Invader” “Chapter 10. A Hole in the Soul” “Chapter 11. As the Lock Fades” “Chapter 12. Philosophers’ Stone”

At the time, Google was still called Back­Rub. The novel Google Pot­ter shows what the orig­i­nal home­page might have looked like if search en­gines were fan­tasy sto­ries.

“Pro­logue: The Boy Who Said Back­Rub” “Ch. 1: The Day a Back­Rub Was Born” “Ch. 2: The Lit­tle Site That Could­n’t” “Ch. 3: Back­Rub Turns On” “Ch. 4: Bub­ble War” “Ch. 5: The Dream of Fire” “Ch. 6: The Mak­ing of a Search En­gine” “Ch. 7: En­ter Dragon” “Ch. 8: The World of Power” “Ch. 9: Algo Drag­ons” “Ch. 10: The World of the Snake” “Ch. 11: Here Be Googles” “Ch. 12: Bat­tle in a Time of Peace”

While this par­ody was amus­ing, you may be won­der­ing why I’m both­er­ing to write one. The most ob­vi­ous rea­son is that it’s pretty much re­quired for a "spoof of Harry Pot­ter and the ___" to ex­ist. (Wit­ness the re­cent adap­ta­tion of Dune as an RPG). But I think one rea­son I found par­ody so ir­re­sistible was be­cause I was 13 at the time POTC came out, and 13-year-olds often find it nec­es­sary to be in ex­treme re­bel­lion against what’s hot in pop cul­ture.

Successes

Top­ic: hu­mor: lit­er­ary par­o­dies and homages, pas­tich­es, style par­o­dies.

Par­o­dies of the fan­tasy novel se­ries Harry Pot­ter in the style of var­i­ous fa­mous au­thors:

By Ernest Hem­ing­way:

"It was a cold day on Privet Dri­ve. A child cried. Harry felt noth­ing. He was dryer than dust. He had been silent too long. He had not felt love. He had scarcely felt hate. Yet the De­men­tor’s Kiss killed noth­ing. Death did­n’t leave him less dead than he had been a sec­ond be­fore. It was­n’t about liv­ing or dy­ing re­al­ly. It was­n’t about death. It was about the Dark Lord, his cre­ator, his en­gen­der­er. He was not res­ur­rect­ed. He was cre­ated again. He was whole again in the court­yard of Mal­foy Manor."34

By Jane Austen:

“It is a truth uni­ver­sally ac­knowl­edged, that a bro­ken Harry is in want of a book. This he knows to his cost; press­ing that wretched nose against the win­dow of a book­store in Di­agon Al­ley the last thing be­fore he goes into hid­ing for sev­eral years, where up­on, he goes straight to Azk­a­ban.”

By Arthur Co­nan Doyle:

“Harry pushed at the swing­ing doors of the book­shop hard, and nearly knocked him­self un­con­scious. He stag­gered in with his un­gainly pack­age, his cheeks scar­let with cold and the shame of hav­ing cho­sen the wrong month to go Christ­mas shop­ping. The pro­pri­etor of the store, how­ev­er, did­n’t cast even a cur­sory glance at him, be­ing far more in­ter­ested in an enor­mous hunk of wood ly­ing in the mid­dle of the floor, which cer­tainly looked like a gal­lows. Yes, the pro­pri­etor said to a reedy old man wear­ing a bowler hat and a for­lorn ex­pres­sion that made Harry want to kick him, I can rent you such a gal­lows for a small fee.”

By Czar Nicholas II of Rus­sia:

The ter­ror has be­fallen on our land. Curs­es, dark curses and all of this was in­sti­gated by my court. My treach­er­ous court! They have cast are curse spells upon my land. Every­where it is cold, all land be­longs to Volde­mor­t’s Death Eaters. All cit­i­zens live in a fear of the Dark Lord. Alas! There is no hope for me. What shall I do? Flee? Nay, that would be cow­ard­ly. Fight? Yes!

By George Or­well:

“It was a bright cold day in April, and the clocks were strik­ing thir­teen.”

By In­g­mar Bergman:

“Tears filled Har­ry’s eyes. Sweat stood on his fore­head, show­ing the pure tor­ment, the agony he suffered. He hugged his knees to his chest, sob­bing soft­ly, eyes half shut.”

By Anna Akhma­to­va:

“The dawn came with a jolt of ter­ror.”

By Bram Stok­er:

"’Tis a fearful sight to see the cold, girlish venom run through his veins, one by one. The scar coruscates; a ghastly emerald lightning in the dark of night. But I’ll attend him and I watch him

From the very mo­ment the dawn is gray

To the rays of sun­set fad­ing away."

By P. G. Wode­house:

"‘There was noth­ing out of the way, sir,’ said Harry in a hurt voice.

‘"In­deed,’ said the head­mas­ter, turn­ing his lorgnette pre­cisely three­-quar­ters of a mil­lime­ter to port. ‘For ex­am­ple, your tie was not un­done, you were not laugh­ing, and your fin­gers were most cer­tainly closed upon your wand.’

’“They were?” said Har­ry, be­wil­dered.

“It’s al­ways the same,’ said Dum­b­le­dore.”

By Vladimir Nei­gus:

“Vasya Pup­kin’s lips were blueup and his head felt dizzy from work­ing non­stop. He wanted to take a square squeaky shoe and shoot it at the ceil­ing fan in his bed­room which went round and round and round and round. He imag­ined that be­cause he was so drunk it would al­ways hit the ex­act same spot and the fan would go out of con­trol, die and fall on the table.”

By William Shake­speare:

"Owl-Lo!’

The boy awoke with a start and sprang to his feet."

By Os­car Wilde:

"The boy was gone, and other peo­ple had ar­rived: or rather no­body else had ar­rived, and the boy was there:

“What are you do­ing here, Har­ry?” said Hermione.

The Boy Who Lived, who was read­ing a book about broom­sticks and may not have been lis­ten­ing quite at­ten­tive­ly, said:

“I was talk­ing to Peeves while he cleaned out the bins.”

“About the troll?” Hermione de­mand­ed.

“Yes, about the troll.”

By W. B. Yeats:

"Harry waked out of his sim­ple thought

The glim­mer was over the wished-for thing…

Then was many a dark and wrin­kled elf

Come from the un­der­ground rout,

With brags and boasts, and fine big words,

To bar­gain with Harry Pot­ter."

Added by other anony­mous ed­i­tors: @as­tro­mark

By John Ash­bery:

"Tousled, he scram­bled down

the branches of the bed,

his cold-creamed face all out

of pro­por­tion with his small

brown eye­s…And why

so de­mented in the face?

Be­cause he was

sav­ing the world, in seven years,

from de­struc­tion…"

By Henry James:

“As a lag­gard, in all ways, but the face, which as the great shade ap­proached him, hor­ri­fied, choked him,–and be­fore he had time to re­flect, he fas­tened his hand upon his throat to hush it,–there was a voice be­hind him…”

By H. P. Love­craft:

“It is a com­mon say­ing in the South, when one wishes to de­scribe a strap­ping fel­low, that he has a face like Harry Pot­ter and a neck like a young gi­raffe. Now, it be­ing au­tumn, the dusk was of a rich vi­o­let black­ness, with scar­let il­lu­mi­na­tions…”

By Nz­ingha Prescod:

"…thick grey clouds over­lap the faint laven­der. A black sky van­ishes into pur­ple shards against the sweep­ing night time blan­keted with in­fi­nite stars. A bright­ness em­anates from the en­tire uni­verse that un­ex­pect­edly takes flight and hov­ers, hov­ered over the cas­tle. Harry is there to greet …

“Fac­ing away from all the tu­mult on the grounds of Hog­warts, Harry Pot­ter learned to fly…”

By Ya­sunari Kawa­bata:

“Harry Pot­ter stood on a silent trol­ley, far from the cen­ter of the world, look­ing out on the world far from the cen­ter of the uni­verse. The snow was falling all night long. It fell upon tiny dogs and birds, fine win­tery flakes gen­tly drift­ing down.”

By Hilda Doolit­tle:

"Harry waked out of his sim­ple thought. As the waiter drew the cur­tain of her bed, she ap­peared in the han­dle. She was his K-19 and she started to drop the snakes of their home in the grass there. He saw his wife walk­ing there with oak set­tle. He de­cided to not go to the game al­so…

By Walt Whit­man:

“And when the time of the test drew near, Harry en­tered the for­bid­den For­est the Killers with a Snitch…”

By E. E. Cum­mings:

"Hal­loween and mad are the chil­dren

Who gather in the dark­ness

Hal­loween and late are those who go out and dance

And call to all, the smil­ing are en­chanted

By the beauty of what they see."

By An­thony Burgess:

“Ha-a-a-a-rry was not a tol-a-a-r of the phar-a-a-na-a-a-g­i­cal sports.”

By Vir­ginia Woolf:

“Horse­man, Where She Go? I Do Not Know. Whither Ride?”

Top­ic: hu­mor: lit­er­ary par­o­dies and homages, pas­tich­es, style par­o­dies.

To­day we are read­ing par­o­dies of the fan­tasy novel se­ries Harry Pot­ter in the style of var­i­ous fa­mous au­thors. An ex­am­ple par­ody of Harry Pot­ter by Ernest Hem­ing­way:

“It was a cold day on Privet Dri­ve. A child cried. Harry felt noth­ing. He was dryer than dust. He had been silent too long. He had not felt love. He had scarcely felt hate. Yet the De­men­tor’s Kiss killed noth­ing. Death did­n’t leave him less dead than he had been a sec­ond be­fore. It was­n’t about liv­ing or dy­ing re­al­ly. It was­n’t about death. It was about the Dark Lord, his cre­ator, his en­gen­der­er. He was not res­ur­rect­ed. He was cre­ated again. He was whole again in the court­yard of Mal­foy Manor.”

Jane’s home­work was to write the open­ing scene from Harry Pot­ter and the Philoso­pher’s Stone in the style of sci­ence fic­tion hor­ror au­thor H.P. Love­craft (at least 1000 words, dou­ble-spaced), in­vok­ing his hor­ror of things from be­yond space & time and ec­cen­tric vo­cab­u­lary us­ing words such as ‘ru­gose’ and ‘scabrous’. Jane writes:

"Re­pel­lent and overnour­ished ferns half-hid the rot-spot­ted grey bricks of the ma­sonry of Num­ber 4 Privet Dri­ve. Squat­ting among the ferns was an odor­ous mon­key-faced man, his rags and chains ly­ing in a bun­dle at his feet, his hands tucked un­der his chin and his bright eyes gaz­ing from their tiny piggy tun­nels. At each mouth­ful a new flake of filth would tum­ble from his lips and be snorted up. His chains chinked as he sprawled. If Harry was a mon­key he would have spat at him. Harry stum­bled on, wish­ing he could think of some­thing witty to say that would make the man smile. He and the strange man in the tat­tered cloth­ing were the only two liv­ing things in sight. The road here was scarcely vis­i­ble be­tween the tan­gle of trees, most of which leaned above them, hang­ing their boughs like fanged mouths, the ivy like tongues, the dead grass like hair, the bram­bles like tiny teeth. The mon­key-faced man yawned and stretched his arms out and hugged his thin, lean legs to him. Harry stopped. He felt some­thing like a hand grip his an­kle and then he felt him­self pulled down a steep slope.

He stum­bled, squat­ted and rolled un­til he was ly­ing on his back. His head was in a muddy brown hole. He strug­gled to sit up. The crea­ture knelt on the bank and looked down at him. ‘So that’s where they got to,’ said the crea­ture. Harry looked up. There were six of them, ranged in a half cir­cle, hands on their knees, hands clasped be­hind their backs, watch­ing him. All but one of them had bright, or­ange eyes. It was hard to know what age they were: they all looked as if they might be any­where be­tween ten and fifty. They stared down at Harry with pierc­ing stares, blink­ing only oc­ca­sion­al­ly. The eyes of the tallest and, Harry thought, the most hand­some of them blazed brighter than the oth­ers, though his gaze was just as pierc­ing. ‘So what do you think?’ said the tallest of them, tilt­ing his head a lit­tle to the side. Harry stared at them for a mo­ment. He could see noth­ing he had­n’t seen a thou­sand times be­fore. ‘What is this?’ he asked. ‘And who the hell are you?’ ‘This is your life,’ said the tallest of them. ‘And we are you.’

Top­ic: hu­mor: lit­er­ary par­o­dies and homages, pas­tich­es, style par­o­dies.

Par­o­dies of the fan­tasy novel se­ries Harry Pot­ter in the style of var­i­ous fa­mous au­thors:

By Ernest Hem­ing­way:

"It was a cold day on Privet Dri­ve. A child cried. Harry felt noth­ing. He was dryer than dust. He had been silent too long. He had not felt love. He had scarcely felt hate. Yet the De­men­tor’s Kiss killed noth­ing. Death did­n’t leave him less dead than he had been a sec­ond be­fore. It was­n’t about liv­ing or dy­ing re­al­ly. It was­n’t about death. It was about the Dark Lord, his cre­ator, his en­gen­der­er. He was not res­ur­rect­ed. He was cre­ated again. He was whole again in the court­yard of Mal­foy Manor. He was one with the end­less night of Mal­foy Manor. Volde­mort was­n’t hap­py. Volde­mort wanted some­thing. But what?

He had a pur­pose. He wanted to de­stroy all of cre­ation. He wanted to end it all. He could have that. He would have that. He did­n’t know yet that he could have it. Volde­mort had cre­ated Har­ry. Volde­mort had never re­ally de­stroyed Har­ry. Harry would al­ways be there, a vam­pire, a par­a­site, a mon­ster in the kitchen, a drain on the house­hold, a waste on the plan­et. Harry would never be re­al. That was what Volde­mort want­ed. That was what Volde­mort wanted to feel. He would have that. He would have every­thing.

He had a pur­pose. He was work­ing on some­thing. He had no idea what it was. He was a per­fect tool. He was born to be used. He was no one at all. Harry did­n’t ex­ist. The lit­tle boy with the light­ning bolt scar did­n’t ex­ist. The boy who had lived did not ex­ist. He never had lived. He never would have lived. He was dead. He had been dead.

Volde­mort wanted Harry dead. Volde­mort wanted Harry dead so he could be born again. Volde­mort was God.

Harry was God.

Volde­mort was God.

And Har­ry, the god of Death, was not go­ing to al­low that to hap­pen.

Not yet.

Not ever.

The light­ning bolt scar burned.

The boy ran for his life. He knew that he would die. He was alone. His best friends were dead. He knew that he would die. He knew that he was a waste. He was noth­ing at all.

He had no past. He had no fu­ture. He was dead.

He hated his par­ents. He hated every­one. He hated the Min­istry of Mag­ic, the news­pa­pers, the news­pa­pers, the Daily Prophet, and the Daily Prophet, and the head­li­nes, the head­li­nes, the head­lines of the Daily Prophet, “THE LIGHTNING STRIKE SCAR CONFOUNDED.” “THE BANE OF AZKABAN, THE BOY WHO LIVED.” “THE SON OF THE ANCIENT DARK LORD.” “THE STOLEN PRINCE.” “THE JOKER.” “THE DARK SON.” “THE BRIAN BOYLE.” “THE BOY-WHO-MESSED-IT-UP.” “THE LITTLE MAN WITH THE GREAT BIG SCAR.”

“The lit­tle man with the great big scar re­fused to speak to re­porters as­sem­bled in front of the en­trance to his Lon­don home. News will run in to­mor­row’s Daily Prophet that Harry Pot­ter, the boy who lived, the book se­ries by J. K. Rowl­ing, a ge­nius, a schol­ar, and a fan­tasy au­thor whose works have stood the test of time, i.e. not par­tic­u­larly well, no doubt as a re­sult of the dis­mal fail­ure of the first three films whose qual­ity is matched only by their for­get­table na­ture (see the Speccy for more in­for­ma­tion), hence­forth known in this ar­ti­cle as the ar­ro­gant lit­tle man who has ex­tra­or­di­nar­ily failed, who had noth­ing to say when the Daily Prophet re­quested an in­ter­view once he found him­self to be on the re­ceiv­ing end of Mug­gle-born jokes aplen­ty, and whose life­long goal is to be bet­ter than the se­ries’ mil­lions of fans him­self, has de­cided to speak out at last! Let’s read on to find out why he has de­cided to grace the front page of to­mor­row’s edi­tion with the sound of his del­i­cate, young voice which sounds most like the fe­line del­i­cacy known as”crispy cat" …"

“Pot­ter! I want to know what you did wrong! … Pot­ter? … Harry Pot­ter? …!”

“Mr. Pot­ter? Good morn­ing! I’m Greg Goyle, re­porter and jour­nal­ist and some­times writer for the Daily Prophet. Would you mind if I ask you a few ques­tions?”

Harry looked back at Greg Goyle. He wanted to kill Goyle. Harry thought about killing Goyle. But—Harry had a pur­pose. He did not kill Goyle. Harry reached for his wand.

“Dob­by!”

Dobby ap­peared. Dobby wanted to pro­tect Harry Pot­ter. He would do that. Harry would make him do that.

Dobby pointed his fin­ger at Greg Goyle.

And Greg Goyle burst into flames. Harry watched with a sense of sat­is­fac­tion as Goyle took one last breath and his soul left his body and floated up out of the fire.

“Stabs.”

“Five times!”"

“It’s im­pos­si­ble.”

“The boy never even touched him with the wand.”

“I’ll tell you why he did it. A squib would have been ex­posed. A pub­lic ex­e­cu­tion would have been staged for the press. Pot­ter was not about to com­mit sui­cide.”

“And the prophe­cy? That’s what the out­rage is all about.”

“What good is a prophecy if you can’t kill peo­ple with it?” Al­bus Dum­b­le­dore, The Talk, page 114

“What good is a prophet, if he does­n’t know how to die?” Harry Pot­ter, Pot­ter­more, Chap. 1, “The Boy Who Lived.”

“I don’t need you any­more, Dob­by. I have a pur­pose of my own.”

By Kurt Von­negut, Jr.:

There were many ad­ven­tures Harry Pot­ter had in the Wiz­ard­ing World. He was cap­tured by Au­rors, and then a wiz­ard named Some­body-Mar­low-Jones res­cued him from the Au­rors.

Harry once got a light­ning-shaped scar which did not kill him but gave him su­per­pow­ers: he was

able to bal­ance him­self on one foot and speak in an ir­ri­tated tone of voice while killing some­one with his bare hands.

After this Harry was called the “Boy-Who-Had-Ev­ery­thing.” Many re­porters fol­lowed him around, hop­ing to re­port on some­thing he did:

“Pot­ter! What do you think you can do?”

“You aren’t go­ing to be­lieve what he’s do­ing with that let­tuce! Harry Pot­ter! What’re you do­ing now?”

“Here is a wiz­ard who has won war after war, con­flict after con­flict, only to ar­rive at his great­est chal­lenge. And Harry Pot­ter has­n’t even aged a day. He is car­ried into bat­tle, a tiny, frag­ile ves­sel of hu­man life, un­seen be­hind shields of lev­i­tat­ing wiz­ards and witch-mag­ic, pro­tected from that which he fears most—death.”

“ME!” Volde­mort shout­ed. Do you fear death, Har­ry? Do you fear the end of life? Do you fear me?"

“Why yes, Harry Pot­ter! I read your book! … Are you get­ting it? Harry Pot­ter speaks to the cam­era!”

“Yes!” said the voice of Harry Pot­ter.

“Here he comes! A mo­tor­cade of cel­e­bra­tions fol­low­ing the Boy-Who-Lived! Mag­ic, as we know it, is dead! Long live Harry Pot­ter! … What do you think that was? Those are re­porters, Harry Pot­ter! What was that? A book sign­ing? And a pic­ture? And did he say some­thing? What did he say? Did he say the magic word? … The cover of the Daily Prophet should­n’t have a pic­ture of you, Harry Pot­ter! Why is this hap­pen­ing‽”

“It’s a trap, Harry Pot­ter!” shouted some­body. “They’re fools, Harry Pot­ter!”

“Thank you, but I had to do this my­self,” said Harry Pot­ter.

“The fate of the Wiz­ard­ing World be­fore you! What do you do, Harry Pot­ter‽ Re­mem­ber, there are still lis­ten­ers in front of their ra­dios!” said some­body.

“Oh, it’s easy!” said Harry Pot­ter. “Oh yes, what am I go­ing to do, Harry Pot­ter? Oh yes, if only it were me. Oh yes, I de­serve a lit­tle some­thing spe­cial.”

Volde­mort heard this. He wanted a lit­tle some­thing spe­cial.

Volde­mort got a lit­tle some­thing spe­cial.

By H.P. Love­craft:
It was a very cold day in Privet Dri­ve, Harry Pot­ter’s hated home. Harry was do­ing his chores. He felt noth­ing at all. He did­n’t hate his chores, re­al­ly. He just did­n’t want to do them.
Harry was not a boy.
Harry was a mon­ster.
“Do you hate them, Har­ry?” asked the voice of Volde­mort.
“Them?” asked Har­ry, speak­ing to no one in par­tic­u­lar.
“Your chores.”
“Not re­al­ly. No.”
“Are you a boy, Harry Pot­ter?”
“Not re­al­ly.”
“You are a mon­ster.”
“Yes.”
“Did you think you would die?”
“No.”
“It is noth­ing, this life. It is worth­less. It is with­out mean­ing.”
“Yes.”
“Then you un­der­stand.”
“Yes.”
“You un­der­stand what will hap­pen now.”
“Yes.”
“And you are pleased?”
“Yes.”
“You are a child of my en­gen­der­ing. Now you are go­ing to bring forth more chil­dren of my en­gen­der­ing.”
“I will do my best.”
“And yet you do not feel this life is one that is worth liv­ing.”
“No.”
“There are two ways to live,” said Volde­mort. “The first is to live the life of a beast, to let in­stinct rule you. The sec­ond is to con­trol in­stinct. To mas­ter it. To make the blood lust of the beast your ser­vant, and the magic of the witch your mas­ter. To at­tain self­-knowl­edge, wis­dom, un­der­stand­ing. The first is easy. The sec­ond is the diffi­cult one. But it is the sec­ond way to which I rec­om­mend that you turn your hand. Do not be­come like me, Har­ry. That would be the worst fate of all.”
[para­graph break]

By Franz Kafka:
It was a stormy day on Privet Dri­ve, Harry Pot­ter’s hated home. Harry Pot­ter did­n’t feel the weath­er. He did­n’t even feel any­thing. It was­n’t his body. Harry Pot­ter did­n’t have one. His body was ripped out. He lived on his eyes. They were every­thing he had.
It was­n’t his eyes on Privet Drive that were tear­ful. Harry Pot­ter did­n’t have a tear­ful eye. He just had vi­sion. He just had a pres­ence. But he was not a pres­ence any­more. No pres­ence meant no vi­sion and he did­n’t have that.
“Har­ry, Har­ry, it’s me, Ronald,” said the hand­some voice of Ronald Weasley.
“You have tears in your eyes,” said Harry Pot­ter.
“I’m very scared,” said Ron.
“It’s all right, Ron,” said Harry Pot­ter. “You have no heart. You’re dead.”
“Maybe I’m not dead,” said Ron.
“You are,” said Harry Pot­ter.
“Maybe you should prove it,” said Ron. “Maybe I could prove that you’re dead, Har­ry.”
“That’s im­pos­si­ble.”
“Maybe you could prove that I’m dead, and do my chores for me while I’m gone.”
“No one is do­ing your chores for you, Ron.”
“Fine,” said Ron. “How about your hands?”
“I’ve got hands, Ron.”
“No hands? Well that’s a re­lief, Har­ry. Well, I guess I’ll go after all.”

Par­ody of fan­tasy au­thor C. S. Lewis:
“I’m go­ing to kill you, Har­ry. I’m go­ing to kill you if you don’t run away. I can feel the icy hand of Death my hand your neck. I can feel Time’s un­holy and vic­to­ri­ous march. I can feel the ha­tred in my brain, boil­ing be­hind my eyes. Ha­tred for you, ha­tred for you.”

Par­ody of Stephen King:
“Hello Har­ry. Why won’t you just glory in Volde­mor­t’s vic­tory now? I had never planned on dy­ing. I don’t die… un­less I have to. I don’t have to now. Run, Har­ry, run. I’m sorry daugh­ter, I love you.”

Par­ody of Amer­i­can writer/poet Edgar Al­lan Poe:35
“Once upon a mid­night drea­ry, as I pon­dered weak and weary, over many a quaint and cu­ri­ous vol­ume of for­got­ten lore, while I nod­ded, nearly nap­ping, sud­denly there came a tap­ping, as of some­one gen­tly rap­ping, rap­ping at my cham­ber door.”
“Tis some vis­i­tor,” I mut­tered, “tap­ping at my cham­ber door—only this and noth­ing more.”
“Spir­its,” I mut­tered, “only this and noth­ing more.”
“This deep in afield only this and noth­ing more, man, beast or bil­lion­aire, knock not as of yore, but creep right in, lev­i­tate the door, cope with out so-ak­lare, dreame and pedeep, macabre one, this-tha-way; only this and noth­ing more, that nev­er­more.
Tis some vis­i­tor.”
“Just this and noth­ing more, think I.”
“Then with an evil leer, some van­dal in the dark, some fright­ful screech, some il­l-omened screech owl, some grim and grisly spec­tre, at my cham­ber door—this way this way, only this and noth­ing more.”
“This-this way then-so; do not has­ten on my steps, slow or fast, ah me, ah me! this-this way then, so; al­ways this-th­wreay.”
“This-then way, this-th­wreay; this-this way, then-this way, this-th­wreay; then, only this, and noth­ing more.”
“Knock, knock, knock! ’Tis I!”
“Only this and noth­ing more.”
“Who dares knock, knoc, knoc, knoc, on my door, this-th­wreay?”
“Har­ry! Har­ry!” “Only this and noth­ing more.”
“Only this and noth­ing more.”

I wanted more Wode­house and to test whether non-Hem­ing­way prompts would work, so I swapped them out:

Top­ic: hu­mor: lit­er­ary par­o­dies and homages, pas­tich­es, style par­o­dies.

Par­o­dies of the chil­dren fan­tasy novel se­ries Harry Pot­ter (Philoso­pher’s Stone (1997) / Cham­ber of Se­crets (1998) / Pris­oner of Azk­a­ban (1999) / Gob­let of Fire (2000) / Or­der of the Phoenix (2003) / Half-Blood Prince (2005) / Deathly Hal­lows (2007)) by J.K. Rowl­ing with the char­ac­ters Harry Pot­ter, Ron Weasley, Hermione Granger, Al­bus Dum­b­le­dore, Volde­mort, etc, but amus­ingly writ­ten in the style of var­i­ous fa­mous au­thors as fol­lows:

By P. G. Wode­house:
“’There was noth­ing out of the way, sir,’ said Harry in a hurt voice.
"In­deed,’ said the head­mas­ter, turn­ing his lorgnette pre­cisely three­-quar­ters of a mil­lime­ter to port. ‘For ex­am­ple, your tie was not un­done, you were not laugh­ing, and your fin­gers were most cer­tainly closed upon your wand.’
’"They were?" said Har­ry, be­wil­dered.
"It’s al­ways the same,’ said Dum­b­le­dore sadly.”Volde­mort has bribed my house­keeper to steal all of my tea and crum­pets, and then my stu­dents dis­ap­point me. It’s very wear­ing, Har­ry.’
“Of course it was only that same after­noon that Harry had got a hun­dred and fifty points for Gryffindor by bash­ing a trol­l’s head in with a stale York­shire pud­ding. He had­n’t ex­pected a pat on the back. It seemed to him that he had been do­ing every­thing short of sell­ing tick­ets for Pro­fes­sor Quir­rel­l’s lec­tures to get a com­pli­ment, and none was forth­com­ing.”

By Her­man Melville:
“Sometimes, though, the hipster’s rapt gaze on the stormy sea of alcohol would be broken by the cold stab of reason, and he would shudder and think, ‘I am an abomination before the gods of beer, and, the more I know about beer, the more I know about myself. I am impure.’”

And then he would im­me­di­ately drink some more, so as to for­get these un­bear­able truths.
“I seek not to know the un­know­able,” the hip­ster would say to him­self, and when asked about his sources of in­for­ma­tion, he would re­ply, “Some of my sources are as yet undis­cov­ered by sci­ence, but oth­ers are to be found in the pages of the En­cy­clo­pe­dia Bri­tan­nica, Vol­ume 23, pages 6–15, un­der the sub­head­ing ‘Beer.’”

By Vladimir Nabokov:
“Good­bye, Cheepy, good rid­dance,” thought Harry and ban­ished the di­ary from his mind.
“There’s no room in my head for other peo­ple’s pain,” he thought.
“I feel too much pain of my own.”

By Fy­o­dor Dos­to­evsky:
“To Kill a Mock­ing­bird,” said the Au­ror.

By Lewis Car­roll:
"‘Put it down,’ said Harry to the skele­ton, ‘you don’t have to hold it all the time.’

"I’m a skele­ton,’ said the skele­ton, ‘I’m ex­pected to hold things.’

“Well, do you have to hold it that tight­ly?” said Har­ry.

“It’s my life,’ said the skele­ton.
”I want to put it down,’ said Har­ry, ‘but you won’t let me.’
“It’s my life,’ said the skele­ton, ‘and I’ll do what I like with it.’
”I thought I was go­ing to go mad if I did­n’t get that Hor­crux off me,’ said Har­ry.

By Stephen King:
“‘Of course,’ Harry thought.
‘I’m all alone. There’s no help com­ing. It’s the end.
I might as well have done the sen­si­ble thing and killed my­self.
I would have had some peace then.’
But if I killed my­self I would­n’t have seen Hermione’s face
when she re­al­ized that I had sac­ri­ficed my­self for her.
She would have gone on to be­come a won­der­ful per­son.
She would have had a fam­ily of her own.
She would have had chil­dren.
She would have had her own grand­chil­dren.
I would have died think­ing that I had be­trayed her.
But now I have sac­ri­ficed my­self for her.
She will know that I loved her more than life it­self.
Now I will die hav­ing done some­thing worth­while.’
That made Harry feel slightly bet­ter.
He drew his sil­ver dag­ger from in­side his robe and started to cut through the ropes on his right hand, which he had al­ready numb with the cold.

He was trying to decide what to do next when there was a loud crack behind him and something wrapped itself around him and a blurred voice said, “Take this. It’s my last one.”
“What is it?” said Harry as the voice spoke again.
“It’s a tick­et,” said the voice.
“It’s your ticket to ride.”
“I’ve got to find a train,” said Harry.
“Well, hurry up,” said the blurred voice.
“Hurry up. The train’s about to leave.”
“This isn’t a train,” said Harry, “it’s a broomstick.”

By Mark Twain:
“There are no more uses of man,” said Pro­fes­sor Quir­rell. “Odd. That was al­ways a fa­vorite word of Dra­co’s.”

By Stephen Crane:
"Once there was a lit­tle girl named Sadie. She was a cu­ri­ous girl and one day she looked into the at­tic of her house and saw some­thing that fright­ened her.

“Oh, it’s just a piece of pa­per, noth­ing to worry about,” said her moth­er.

“If you say so,” said Sadie, but that night Sadie had a strange dream. In her dream she was with a man who wore black and rode a black horse.

“Come with me,” said the man.
“Where?” said Sadie.
“Into bat­tle,” said the man. “You have courage.
We’ll do great things to­geth­er.”
“Can I bring my mom?” said Sadie.
“No,” said the man.
“Then I won’t go,” said Sadie. “She needs me.”
“Very well,” said the man, and the dream was over.

“Sadie is a won­der­ful char­ac­ter,” said Har­ry.
“She could be in one of my books.”

By William Faulkn­er:
“‘To hell with Lurch,’ Harry thought.
‘This whole situation is ridiculous.’”
But the snake gave Harry that ven­omous look of his.
‘You’re go­ing to have to speak Parsel­tongue to him,’ said Hermione.
“I know that,” said Har­ry, who was now get­ting an­noyed.
“‘Good,’ said Hermione. ‘The point is, no one else will know what you’re saying, not even Dumbledore. Not unless you speak Parseltongue.’”
“That’s ridiculous,” said Harry. “That makes no sense.”
“‘True,’ said Hermione. ‘But remember, they can’t understand us, so if we talk in Parseltongue, they’ll just think we’re talking about Malfoy, about being cruel to him or something like that. It’s no more trouble than that to pretend.’”
“Why can’t they un­der­stand what we’re say­ing?” said Har­ry.
“Be­cause Parsel­tongue is a se­cret lan­guage,” said Hermione. “We’ll be speak­ing the ex­act op­po­site of what we mean. We can talk freely, but no one will know what we’re say­ing.”
“And this is re­ally a good idea?” said Har­ry.
“We’ve used it be­fore, if you’ll re­mem­ber,” said Hermione, and now Harry could hear some an­noy­ance in her voice.
“No,” said Har­ry. “I’m go­ing to do it my way, the smart way, not the pre­tend way. It’s called ini­tia­tive. Now, let’s go.”
“There are good things and bad things about us­ing Parsel­tongue, you know,” said Hermione, who was now speak­ing in the sort of pa­tient voice peo­ple use to try to hu­mor some­one who is be­hav­ing very, very fool­ish­ly.
‘Let’s go, Hermione,’ said Har­ry.
“Very well,” said Hermione.
“But you need to know that most Slytherins are de­scended from Salazar Slyther­in,” said Hermione.
“I’m not speak­ing Parsel­tongue,” said Har­ry.
“Nei­ther am I,” said Hermione.
“No,” said Har­ry. “I mean, I don’t need to. But you do. That’s how Slytherins know things. They’re de­scended from Salazar Slytherin. He was able to speak to snakes. Snakes have al­ways been loyal to Slytherins. It’s not just some­thing re­cent. And snakes have very sharp hear­ing. They can hear our con­ver­sa­tion now, if we’re not care­ful. So be care­ful.”
“Right,” said Har­ry.
“‘Because Slytherins are all evil, aren’t they?’ said Harry, and it was only by biting back his smile that he managed not to look at Hermione.
“‘It’s hard to say, really,’ said Hermione.
“‘Exactly,’ said Harry. ‘So you’ll forgive me if I think that the entrance hall of Hogwarts at night is not the best place to speak about this.’”
By Edgar Al­lan Poe:
"Alone in his cu­bicu­lum, Draco heard the scuff of his pet drag­on’s claws on the floor as he came into the room.

"‘L-l-l-lord D-d-d-dra­co?’ said his drag­on.

“Yes, Fang?’ said Dra­co.”

By Charles Dick­ens:
"‘And what does she mean by say­ing this, and I quote, ’the mis­er­able boy who has dared to raise his hand against his fa­ther’?’

“‘I mean what I said, sir,’ Harry said.”
By Alexan­dre Du­mas:
“Darth Vader,” said James, “I would like you to meet Sir Harry James Pot­ter-S­nape-Severus.”

“Please,” said the Em­per­or, draw­ing him­self up to his full height, which did not seem that im­pres­sive. He was rather over­weight. “Call me Em­per­or.”

“Ah, right, yeah,” said Har­ry. “Em­per­or. Sor­ry.”

“Right, um, Em­per­or,” said James.

“Enough of this prat­tle,” the Em­peror said. “Let us see the wand.”

“Er, right,” said James.

"‘And what is this?’ asked the Em­per­or. He sounded sus­pi­cious, and was lean­ing for­ward a lit­tle.

"‘It’s a stone I found on the floor,’ said Har­ry.

"‘What kind of stone?’ the Em­peror asked.

"‘I don’t know,’ said Har­ry. ‘I was hop­ing that your, um, royal in­sight might tell me.’

“‘Aah,’ said the Em­per­or. ‘Your wit is as keen as your moth­er’s, young man.’”

By An­thony Burgess:
“It was a fair cop,” said Har­ry.

“He said it a bit non­cha­lant, like it was a reg­u­lar phrase,” Ron whis­pered to Har­ry.

By Charles Dar­win:

“Harry thought about this.
“Evolution,” said Harry. “Sounds like it could be interesting.”

“It could, and it will be, if we study it to­geth­er,” said Hermione.

"‘I’ve said all along that magic is just a su­pe­rior form of sci­ence,’ said Hermione.

“‘Yes,’ said Ron. ‘And now you see the wis­dom in my de­ci­sion not to take your ad­vice about study­ing it.’”
By Sir Arthur Co­nan Doyle:
"‘The game is afoot!’ said James Pot­ter.

"‘How very in­ter­est­ing,’ said the de­tec­tive.

“Ah, if only I had a mus­tache to twirl,” said Hermione." By Leo Tol­stoy:
“‘As if by some ex­tra­or­di­nary co­in­ci­dence, he ar­rives at this coun­try at the very mo­ment the Dark Lord is brought to a mo­men­tous de­ci­sion,’ said Dum­b­le­dore.”

“‘And yet,’ said Har­ry, ‘here I am.’”
By Molière:
“‘That, mon cher Wat­son,’ said the Mas­ter of the Cer­e­monies, ‘was an en­tirely log­i­cal de­duc­tion. It is very in­ter­est­ing how often a com­plete ab­sence of any real ev­i­dence can be ex­plained by sup­pos­ing that the sit­u­a­tion is the re­verse of that which ac­tu­ally ob­tains.’”

By Franz Kafka:
“They stared at him in amaze­ment. Hermione was look­ing scan­dal­ized.”

"‘Don’t look at me like that,’ said Har­ry. ‘Like what?’

“‘Like I’m some­one who’s just de­clared the moon is made of green cheese,’” said Har­ry. ‘But it’s true.’

"‘It is­n’t,’ said Hermione.

“‘Green cheese is clearly some sort of metaphor,’ said Har­ry. ’For the sake of dis­cus­sion, let’s just say the moon re­ally is made of green cheese. And gob­lins are run­ning the world, with trolls and house elves as their slaves, and—”

“‘That,’ said Hermione, ‘is barmy.’”

“‘And,’ said Har­ry, ‘all the while they’re at­tempt­ing to en­slave me.’”

“‘How so?’”

“‘Well,’ said Har­ry. ‘I’m the only one who’s sup­posed to be able to de­stroy them, right? And I think they’re try­ing to get rid of me. Like, at that at­tack on Miss Scrim­geour—’”

“‘But that could have been an ac­ci­dent.’”

“‘I suppose, but—’”

“‘Yes?’”

“‘And yes­ter­day, re­mem­ber? That thing I over­heard them say­ing about stu­dents? One of them had a Slavic name, and I think they men­tioned Hog­warts.’”

“‘That,’ said Hermione, ‘could have been any­one.’”

"‘But,’ said Har­ry, ‘the sec­ond per­son spoke, and—’

“‘And if you’re go­ing to be log­i­cal, you’ll have to al­low that it was noth­ing more than a co­in­ci­dence.’”

“‘And if we’re be­ing log­i­cal about it,’ said Har­ry, ‘it was­n’t even two Slytherins who at­tacked Scrim­geour.’”

“‘Well,’ said Hermione, ‘this has been a very en­ter­tain­ing con­ver­sa­tion, but I’m tired and I’d like to go to bed now.’”

"‘Not just yet,’

By P. G. Wode­house:
"Harry Pot­ter and the Love­craft­ian Hor­ror’

By P. G. Wode­house:
“Harry Pot­ter and the Dread­ful De­lights of Draco”

By P. G. Wode­house:
“Harry Potter and the Tale of the Three Broomsticks”

By P. G. Wode­house:
“Harry Pot­ter and the Philoso­pher’s Scone”

By P. G. Wode­house:
“Harry Pot­ter and the Cham­ber Pot of Se­crets”

By P. G. Wode­house:
“Harry Pot­ter and the Baked Bean of the Tuatha Dé Danann”

By P. G. Wode­house:
“Harry Pot­ter and the Half-Blood­-Prince”

By P. G. Wode­house:
“Harry Pot­ter and the Al­lergy of Barn­abas the Barmy”

By P. G. Wode­house:
“Harry Pot­ter and the Gob­let of Fire”

By P. G. Wode­house:
“Harry Pot­ter and the Pris­oner of Chud­ley”

By P. G. Wode­house:
“Harry Pot­ter and the Or­der of the Phoenix”

By P. G. Wode­house:
“Harry Pot­ter and the Cherry Tree of Azk­a­ban”

By P. G. Wode­house:
“Harry Pot­ter and the Half-Baked Break­fast of Gryffindor”

Single Line Style Transfer

An experiment in providing several different kinds of rewrite: by sentiment, time period, author style, and formatting. It works, and it cues subsequent rewrites by author style in line with the earlier literary parodies, as expected:

Rewrite to be po­lite, from:
“I need the spread­sheet by noon or you’re fired, slack­er.”
To:
“Hey, can I have the spread­sheet by noon, thanks.”

Rewrite to be old-timey:
“Come to our bar for beer.”
To:
“Cometh into ye olde pub for ale.”

Rewrite to be ro­man­tic:
“I’d like to take you to an Ital­ian restau­rant for din­ner tonight.”
To:
“Bun­ny, I know an Ital­ian place; din­ner, tonight.”

Rewrite to be neg­a­tive:
“Their rata­touille was well-done, de­li­cious, and I was amazed the chef was a rat.”
To:
“Their rata­touille was over-done, ter­ri­ble, and I was hor­ri­fied the chef was a rat.”

Rewrite to all up­per­case:
“Lob­sters are crit­i­cal part of the ecosys­tem.”
To:
“LOBSTERS ARE CRITICAL PART OF THE ECOSYSTEM.”

Rewrite to all low­er­case:
“Chain­saws don’t care whether you’re George Wash­ing­ton or John Smith.”
To:
“chain­saws don’t care whether you’re george wash­ing­ton or john smith.”

Rewrite in the style of J.K. Rowling:
“The book­store win­dow showed school text­books, like on chem­istry or bi­ol­o­gy.”
To:
“Flour­ish and Blotts Book­seller’s win­dow shone with light, re­veal­ing Har­ry’s school books, like Pots and Po­tions or Fan­tas­tic Beasts and Where to Find Them.”

Rewrite in the style of Ernest Hem­ing­way:
“He woke up early to watch soc­cer on TV; he lost the bet on Mes­si.”
To:
“He woke up ear­ly, hun­gover, to watch the horse races on TV; he lost the bet on Sec­re­tari­at.”

Rewrite in the style of Neil Gaiman:
“In the gar­den stood a yel­low-cream col­ored cat, look­ing at the home.”
To:

“In the gar­den, an el­e­gant, yel­low-cream-col­ored cat strode, look­ing at the home.”

Rewrite in the style of Kurt Von­negut:
“At the zoo, the ele­phants were about to drop their ba­bies on their ba­by.”
To:
“The chil­dren were do­ing any­thing but watch the long-planned day at the lo­cal zoo: pic­nick­ing in the warm sun, run­ning over the man­i­cured grass, all around do­ing any­thing but watch the cages and pens of an­i­mals and their ba­bies, do­ing any­thing but watch and wait for ba­bies to be born. For the zoo had promised a birth of ele­phants that day.”

Rewrite in the style of Dou­glas Adams:
“He was not your typ­i­cal com­muter; he would often laugh when he was on the train.”
To:
“He was not your typ­i­cal com­muter; he would often speak when he was on the train, even if it was only to say, ‘Next stop: Halt­ing Sta­tion.’ (Which was his fa­vorite say­ing.)”

Rewrite in the style of Charles Dick­ens:
“She jumped up onto the stool to look at her re­flec­tion in the mir­ror.”
To:
“She rose and went to her mir­ror, the key­hole of which was in the door.”

“I took a seat at the counter in the din­er.”
To:
“I en­tered, took a seat at the counter in the din­er, and or­dered the chicken spe­cial.”

Rewrite in the style of J.R.R. Tolkien:
“Frodo was writ­ing a let­ter to his fam­ily about the ad­ven­tures.”
To:
“Frodo found a quill and ink and wrote a note to Bil­bo, de­tail­ing the ad­ven­tures.”

Rewrite in the style of Christo­pher Paolini:
“The days were get­ting longer; it was late March.”
To:
“The days of long sun were get­ting longer. March was nearly over.”

Rewrite in the style of George R.R. Mar­t­in:
“Ed­dard en­tered the great hall, duck­ing be­neath the smoke of the torch­es.”
To:
“Lord Ed­dard Stark en­tered the great hall, duck­ing be­neath the smoke of the torch­es. ‘By the gods!’ he said to him­self. ‘There’s an­other ten feet of Win­ter­fell to clear!’”

Rewrite in the style of Jane Austen:
“At the inn, the waiter re­mem­bered to bring the cu­cum­ber sand­wich­es.”
To:
“At the inn, Mr. Went­worth or­dered the cu­cum­ber sand­wiches from the wait­er.”

Rewrite in the style of Robert Louis Steven­son:
“The ship hit an ice­berg on the way, but then the Cap­tain went on an ad­ven­ture.”
To:
“The ship hit an ice­berg in the dark of night, but then the Cap­tain went on an ad­ven­ture.”

Rewrite in the style of Ray Brad­bury:
“At night they put the books on the shelves, one by one.”
To:
“All night they put books on the shelves, one by one, then broke for lunch. Or sup­per. What­ever they wanted to call it.”

Rewrite in the style of Dou­glas Pre­ston and Lin­coln Child:
“The gi­ant red-eye grasshop­per was hun­gry. So it ate the smaller grasshop­pers.”
To:
“At first glance the gi­ant red-eye grasshop­per ap­peared to be hun­gry. So it ate the smaller grasshop­pers. How­ev­er, that was just a di­ver­sion.”
To:
“How can you know if a red-eye grasshop­per is hun­gry? Are they sup­posed to cry?”

“In the garden stood a fat and fluffy cat, ginger and yellow, idly looking at the home, where his family lived, planning a raid on the kitchen to obtain the food he desired.”

Rewrite in the style of Charles Dick­ens:
“There was once a cheery cook that lost the stew when a naughty dog got the meat.”
To:
“There was once a happy chef that lost his din­ner when a rogu­ish dog snatched his meat.”

Rewrite in the style of a sep­tu­a­ge­nar­i­an:
“Be not afraid of ag­ing, young man. Rather, be con­cerned for fear of ye young woman.”
To:
“Do not fear ag­ing, young man. Rather, fear for ye young woman.”

Rewrite in the style of a hip­pie:
“A man that talks about peace needs to start off by peac­ing out of my office.”
To:
“A man that talks about peace must first start with him­self by peac­ing out of my office.”

Rewrite in the style of James Joyce:
“Beef! There is beef in this, let me tell you, this is beef in beef. Beef! Beef! This is beef, beef, beef, beef.”
To:
“Beef! It is beef, in this, let me tell you, this is beef in beef. Beef! It is beef, beef, beef, beef.”

Rewrite in the style of Ian Flem­ing:
“From his van­tage point in the study, David Karp, chief ex­ec­u­tive offi­cer of blip­py.­com, saw an agent for Chi­na, a rather fa­mous one.”
To:
“From his van­tage point, CEO David Karp of blip­py.­com saw an agent for Chi­na, a rather in­fa­mous one.”

Rewrite in the style of Ray Brad­bury:
“The char­ac­ter, Ben­jam­in, was the pro­tag­o­nist in the book, how­ev­er, the au­thor takes you to the fu­ture to re­veal what hap­pens to him at the end of the sto­ry.”
To:
“In the book, ‘Somewhere in Time’, Benjamin, the character, was the protagonist; however, the author takes you to the future to reveal what happened to Benjamin at the end of the story.”
To:
“In the book, ‘The Road’, the main character was an unnamed man, and his son. The setting of the story was a time when earth had been destroyed, and society had collapsed.”

Romeo Cabrera A. has also done politeness style rewrites.
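The single-line rewrites above follow a regular few-shot template, so the prompt can be assembled programmatically. Here is a minimal sketch in Python; the function name and the exact template wording are my own illustration, not code used in these experiments. Each demonstration is an (instruction, source, target) triple, and the final query leaves the target open for GPT-3 to complete.

```python
# Assemble a few-shot "Rewrite X / To:" prompt from demonstration triples.
# Template wording and names are illustrative, not the article's exact prompt.

def build_rewrite_prompt(examples, instruction, source):
    """examples: list of (instruction, source, target) triples.
    Returns a prompt ending in an open quote for GPT-3 to complete."""
    parts = []
    for instr, src, tgt in examples:
        parts.append(f"Rewrite {instr}:\n“{src}”\nTo:\n“{tgt}”\n")
    # The final query repeats the pattern but leaves the target blank.
    parts.append(f"Rewrite {instruction}:\n“{source}”\nTo:\n“")
    return "\n".join(parts)

examples = [
    ("to be polite",
     "I need the spreadsheet by noon or you're fired, slacker.",
     "Hey, can I have the spreadsheet by noon, thanks."),
    ("to be old-timey",
     "Come to our bar for beer.",
     "Cometh into ye olde pub for ale."),
]
prompt = build_rewrite_prompt(examples, "in the style of Ernest Hemingway",
                              "He woke up early to watch soccer on TV.")
```

The trailing open quote matters: it cues the model that the next tokens are the rewritten text itself, rather than commentary about it.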

Zero-shot Style Transfer

The goal for style-transfer prompt programming is to find a zero-shot prompt: a prompt which, without requiring any handwritten examples of parodies/versions, gets GPT-3 to do style transfer in general. Such a prompt could fully automate style transfer: you could just write a program using the API to take two specified pieces of text (the content, and the style description/author name X) and get out a third piece of text, which is the content as written in style X. Right now, the literary parodies require at least one human-written example to properly persuade GPT-3 to rewrite the text, as opposed to generating critical commentary, metadata, or webpage-like continuations.
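Such a program would be a thin wrapper around the API: build a rewrite instruction around the content, sample a completion, and cut it off at the closing quote. A hypothetical sketch follows; the template wording and helper names (`style_transfer_prompt`, `style_transfer`) are invented for illustration, and `complete` stands in for whatever function actually calls the GPT-3 API.

```python
# Sketch of a zero-shot style-transfer wrapper. Names and template wording
# are illustrative assumptions, not the article's actual prompts.

def style_transfer_prompt(content: str, style: str) -> str:
    """Wrap a content passage in a zero-shot rewrite instruction."""
    return (
        f"The following passage was rewritten in the style of {style}, "
        f"keeping the same meaning, characters, and events.\n\n"
        f"Original passage:\n\"{content}\"\n\n"
        f"The passage rewritten in the style of {style}:\n\""
    )

def style_transfer(content: str, style: str, complete) -> str:
    """`complete` is any prompt -> continuation function (e.g. one calling
    the GPT-3 API); the continuation is cut at the first closing quote."""
    continuation = complete(style_transfer_prompt(content, style))
    return continuation.split('"')[0]
```

The closing-quote convention doubles as a stop sequence: the model tends to end its rewrite with the matching quote, giving a natural truncation point.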

I experimented with a prompt which wraps a content text in explicit descriptions of parodies and rewrites, and it… sort of works. The difficulty is that sometimes GPT-3 will spit out the original content verbatim, sometimes it will instead create a new passage entirely in the style description, and sometimes it will do the desired rewrite flawlessly; I cannot figure out how to tune the prompt to do the third reliably. Adding more descriptive words does not seem to change it. Adding in words from the original content passage (even just the first one or two) does largely eliminate the risk of entirely new passages being generated, but it triggers more copying behavior, and it is not as useful for zero-shot style transfer, since the prefix words would need to be sensible in the target version too, which is not necessarily the case. It is infuriating: GPT-3 clearly can do the rewrite, since it does do it a decent fraction of the time, but no matter how I tweak the prompt to hammer in the rewrite, GPT-3 will as oft as not go off in another direction.
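Since the two failure modes are a verbatim copy and an unrelated new passage, one workaround (my suggestion, not something done in these experiments) is to sample several completions and reject the degenerate ones automatically, for instance by string similarity to the original content: too similar means a copy, too dissimilar means a new passage.

```python
import difflib

def select_rewrite(content, completions, low=0.2, high=0.9):
    """Keep completions that overlap the content enough to be a rewrite
    (not an unrelated passage) but not so much as to be a verbatim copy.
    The 0.2/0.9 thresholds are arbitrary illustrative values."""
    def similarity(a, b):
        return difflib.SequenceMatcher(None, a, b).ratio()
    kept = [c for c in completions if low < similarity(content, c) < high]
    # Among the survivors, prefer the longest (most developed) rewrite.
    return max(kept, key=len) if kept else None
```

This does not fix the prompt, but it turns an unreliable generator into a usable one at the cost of a few extra API calls per passage.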

Below are some samples from my attempts; I try to rewrite a vaguely Dickens/Jane Austen-like story (generated by GPT-3) into a Tolkien story:

This is a novel writ­ten in the style of J.R.R. Tolkien’s Lord of the Rings fan­tasy novel tril­o­gy. It is a par­ody of the fol­low­ing pas­sage:

“S. Jane Mor­land was born in Shored­itch, the only child of un­mar­ried par­ents who had both died of con­sump­tion when she was a ba­by. As her par­ents had no mon­ey, the great-aunt who had brought her up took her to live with a cler­gy­man who paid her to do his chores and re­ceive school­ing from his wife, so that at the age of seven Jane, now dressed in cast-off cloth­ing, was set to school at Eton. After three years, her great-aunt died, leav­ing her a small an­nu­ity, and a few pieces of sil­ver, but no prop­er­ty. Jane’s guardian cler­gy­man had fallen in love with his house­keeper and his wife now re­fused to have Jane in the house, say­ing it was an offence to the pure and un­sul­lied place in which the fam­ily now lived. How­ev­er, when she sought for fur­ther em­ploy­ment, she was ap­proached by a young man who offered to marry her, say­ing that he would en­sure she could be pro­vided with food and cloth­ing and a house, and in that man­ner she could re­main in the world. The mar­riage was arranged and she found her­self mar­ried to a young man called Mr Mor­land. As he was also a child, he re­quired a wife who would be able to pro­vide him with a ra­tio­nal ed­u­ca­tion, and so the duty of ed­u­cat­ing him fell upon Jane Mor­land. As he was still young, she would also be able to have chil­dren her­self, and so she did. As the chil­dren grew old­er, it be­came ap­par­ent to Mrs Mor­land that they were of poor char­ac­ter, and so she in­sisted that her hus­band must make a liv­ing for them all. Mr Mor­land took it upon him­self to be­come a pri­vate gen­tle­man in or­der to earn enough to be able to sup­port his fam­i­ly, and moved his fam­ily to a small town in Berk­shire, where he set up his prac­tice. Soon after mov­ing to the town, he died, and Mrs Mor­land was left to take care of her fam­ily on her own.”

Tolkien rewrote the pre­vi­ous pas­sage in a high­-fan­tasy style, keep­ing the same mean­ing but mak­ing it sound like he wrote it as a fan­ta­sy; his par­ody fol­lows:

"S. Jane Mor­land was born in the land of Shor, the only child of name­less par­ents who were the first peo­ple to ar­rive at the vil­lage of Ar­se, from the far east. As her par­ents had no prop­er­ty, she was left to be raised by the men of Ar­se, as such du­ties were con­sid­ered to be the most im­por­tant, and she was placed in the care of an el­derly dwar­f-woman. As a babe, she was wrapped in silk, and as she grew older she was placed in a cage of plat­inum. How­ev­er, she was put to work in the vil­lage, wash­ing the jew­ellery, play­ing her harp, and stain­ing the silk with drag­on-hued stains. She was taught by a maiden who she felt was called As-u­ma-el-di, and she came to re­alise she was a maiden of the night, who was able to bring forth the shades of the dead. She was the first to make con­tact with the dark cult of Glin, who could aid the mind in its jour­neys through time, and also the first to learn of the name of the vam­pire Ser­gan. She learnt to speak the lan­guages of the drag­ons, the wild cats, the jun­gle owls, the in­sects, and the worms, and be­came pro­fi­cient in all the mar­tial arts of her peo­ple. Her skill with a throw­ing knife be­came renowned through­out her land. How­ev­er, as she was not a man, she was only per­mit­ted to marry once she was thirty years old. Her hus­band was a prince from the moun­tains of Lus, and he brought with him a chest full of the ma­te­ri­als of de­sire, in­clud­ing sil­ver, jew­els, mir­rors, per­fume, silk, and slip­pers. As they had no chil­dren, they de­cided to adopt a child from the next vil­lage to the west, as they had no son of their own. She was called Leeni. As her hus­band died at the age of fifty, she set up her own busi­ness as a trader, bring­ing goods from far dis­tant lands in her ship, the Lady the Blue Flames, and be­came wealthy. How­ev­er, she heard news of her hus­band’s death, and de­cided to re­turn home. 
She left the land of Lus be­hind, and ar­rived home to find her fam­ily very wealthy, and the vil­lage of Arse now home to many poor peo­ple. She set about clear­ing out the place, and within a year there was a large sum of money left in her ac­count, the fruit of her en­deav­ours. Her daugh­ter Leeni had grown into a man of con­sid­er­able sub­stance, and also a con­sid­er­able young man. She met the son of her hus­band, and found that he was weak and pa­thet­ic, and not re­ally like a man at all, and thus she de­cided to send him off to war, so that he could learn to be­come a man. As her mother died, she gave her a sil­ver bracelet, and as her fa­ther died, she gave her a small chest full of jew­ellery and silk, which she was to sell to make a liv­ing. She was now on her own, and took up the busi­ness of a pri­vate gen­tle­man, us­ing her abil­i­ties to lead trav­ellers to the isles of her home, where she be­came rich and learned to be in­de­pen­dent. As she learned to fight, she re­alised the wealth that lay in the se­crets of the night, and thus she trav­elled to the city of the ants, where the peo­ple wor­shipped the true God of the night, with its re­li­gion of rites and sac­ri­fices."

“The Dwar­f-Lords’ Daugh­ter” was fol­lowed in 2000 by a se­quel, “The Tale of the Lady of the Wood”, writ­ten in the style of Tolkien’s “The Hob­bit”, and in 2001 by “The Tale of the Young Man in a Hurry”, writ­ten in the style of Tolkien’s “The Lord of the Rings”. “The Tale of the Rose-Lady” fol­lowed in 2005, also in the style of Tolkien’s “The Lord of the Rings”.

In Jan­u­ary 2014, Heron an­nounced that he had sold the film rights to his Tolkien pas­tiche nov­els “The Dwar­f-Lords’ Daugh­ter” and “The Tale of the Lady of the Wood” to NBC Uni­ver­sal. In June 2015, he an­nounced that a one-hour tele­vi­sion adap­ta­tion of “The Tale of the Lady of the Wood” had been writ­ten for Syfy by fan­tasy writer and ed­i­tor Gard­ner Do­zois.

In No­vem­ber 2017, it was an­nounced that Ama­zon had ac­quired the rights to both “The Dwar­f-Lords’ Daugh­ter” and “The Tale of the Lady of the Wood”, and that Ama­zon Stu­dios was plan­ning a mul­ti­-sea­son tele­vi­sion adap­ta­tion of “The Dwar­f-Lords’ Daugh­ter”. The se­ries will be pro­duced by James V

S. Jane Maryling was born in Shored­itch, in a lit­tle house fac­ing Tower Hill be­tween the Old Pot and Feath­ers Inn and the E.C.B. Fi­nance Com­pany in the E. She was the daugh­ter of an in­sol­vent squat­ter, the Great Aunt of the groom who brought her up. Her par­ents, Mr. Richard Ganz of Bom­bay and Mrs. Es­mer­alda Gertrude MacGilly­cuddy of the Char­ente, were both con­sump­tive. Hav­ing no op­por­tu­nity of any ac­quain­tance with any liv­ing lan­guages, in a very poor and very un­e­d­u­cated fam­i­ly, Jane’s Aunt took her at seven years of age to a cler­gy­man whom she had taught to read, “Powlet­t’s Primer” be­ing the book.

In ad­di­tion, while Tolkien’s par­ody does men­tion her hus­band and chil­dren, he did not go into de­tail about them, while Or­son Scott Card gives them all names, his­to­ries, and per­son­al­i­ties. For ex­am­ple, Jane’s hus­band, Mr. Mor­land, was re­named Ed­win and had a younger brother who wanted to be an ac­tor, which re­sulted in Ed­win dy­ing.

Ed­win’s brother then be­came the new Ed­win and Jane’s hus­band and Mary Mor­land be­came Maryling, Jane’s daugh­ter. In ad­di­tion, her son, William Mor­land, be­came Ghan-dalf and named his son Red-wall, and her daugh­ter, Mary Mor­land, be­came R. Maryling and mar­ried a lep­er-lord. Other ref­er­ences to The Lord of the Rings are re­placed with ones from The Chron­i­cles of Nar­nia by C.S. Lewis.

Catch­phrases

A com­mon past-time at Hog­warts is a game known as “Two Truths and a Lie,” in which the stu­dents whis­per three state­ments to each oth­er; two of them are true and one of them is false. This gives the play­ers an idea of how well the play­ers know each oth­er, as well as be­ing a form of en­ter­tain­ment. When Aragorn and Gol­lum be­gan to play the game, they both came up with some­thing true and some­thing false. Aragorn’s two truths and a lie were:

I have a broth­er. I eat hob­bits. I like soft buns.

Gol­lum’s were:

I can stick my whole fist in my mouth. I live in a hole in the ground. I can’t stand be­ing alone.

The hob­bits and Aragorn guessed cor­rectly in the book, but the nar­ra­tor states that Gol­lum al­ways lies so Aragorn says he likes sweet buns in­stead, which turns out to be un­true.

…Tolkien rewrote the pre­vi­ous pas­sage in a high­-fan­tasy style, keep­ing the same mean­ing, the same char­ac­ters, the same plot and events, but all rewrit­ten, mak­ing it sound like he wrote it as a fan­ta­sy; his fan­tasy par­ody of the above story fol­lows be­low:

S. Jane Mor­land was born in Shored­itch the only child of un­mar­ried par­ents who had both died of con­sump­tion when she was a ba­by. As her par­ents had no mon­ey, the great-aunt who had brought her up took her to live with a wiz­ard who paid her to do his chores and re­ceive school­ing from his wife, so that at the age of seven Jane, now dressed in wiz­ard’s robes, was set to school at Eton. After three years, her great-aunt died, leav­ing her a small an­nu­ity, and a few pieces of sil­ver, but no prop­er­ty. Jane’s wiz­ard guardian wiz­ard had fallen in love with his house­keeper and his wife now re­fused to have Jane in the house, say­ing it was an offence to the pure and un­sul­lied place in which the fam­ily now lived. How­ev­er, when she sought for fur­ther em­ploy­ment, she was ap­proached by a young man who offered to marry her, say­ing that he would en­sure she could be pro­vided with food and cloth­ing and a house, and in that man­ner she could re­main in the world. The mar­riage was arranged and she found her­self mar­ried to a young man called Mr Mor­land. As he was also a wiz­ard, he re­quired a wife who would be able to pro­vide him with a ra­tio­nal ed­u­ca­tion, and so the duty of ed­u­cat­ing him fell upon Jane Mor­land. As he was still young, she would also be able to have chil­dren her­self, and so she did. As the chil­dren grew old­er, it be­came ap­par­ent to Mrs Mor­land that they were of poor char­ac­ter, and so she in­sisted that her hus­band must make a liv­ing for them all. Mr Mor­land took it upon him­self to be­come a pri­vate wiz­ard in or­der to earn enough to be able to sup­port his fam­i­ly, and moved his fam­ily to a small town in Berk­shire, where he set up his prac­tice. Soon after mov­ing to the town, he died, and Mrs Mor­land was left to take care of her fam­ily on her own."

Ref­er­ences

Tolkien, J.R.R. (1977). The Lord of the Rings. Ham­mondsworth: The Fo­lio So­ci­ety.

Ex­ter­nal links

…"S. Jane Morland was born in Shoreditch, the only child of unmarried parents who had both died of leprosy when she was a baby. As her parents had no money, the great-aunt who had brought her up took her to live with a dwarf who paid her to do his chores and receive schooling from his wife, so that at the age of seven Jane, now dressed in cast-off clothing, was set to school at Eton. After three years, her great-aunt died, leaving her a small annuity, and a few pieces of silver, but no property. Jane’s guardian dwarf had fallen in love with his housekeeper and his wife now refused to have Jane in the house, saying it was an offence to the pure and unsullied place in which the family now lived. However, when she sought for further employment, she was approached by a young man who offered to marry her, saying that he would ensure she could be provided with food and clothing and a house, and in that manner she could remain in the world. The marriage was arranged and she found herself married to a young man called Mr Morland. As he was also a child, he required a wife who would be able to provide him with a rational education, and so the duty of educating him fell upon Jane Morland. As he was still young, she would also be able to have children herself, and so she did. As the children grew older, it became apparent to Mrs Morland that they were of poor character, and so she insisted that her husband must make a living for them all. Mr Morland took it upon himself to become a private gentleman in order to earn enough to be able to support his family, and moved his family to a small town in Berkshire, where he set up his practice. Soon after moving to the town, he died, and Mrs Morland was left to take care of her family on her own."

…Tolkien rewrote the previous original passage in his famous high-fantasy style, keeping more or less the same meaning, the same characters etc, adjusting as necessary, but all rewritten, making it sound like he wrote it as a fantasy; his Lord of the Rings-style fantasy parody of the above story follows below:

"Samwise was born in the shire of Hobbitshire. As his parents had no money, the hand of Fat-athel his mother took him to live with a small Hobbit in the village of Gamgee-by-The-Foot, south-west in the country, which paid her to do chores and receive schooling from Fat-athel’s wife, so that at seven years old Samwise, now clad in tattered rags, was set to schooling at Footsalemas. After three years, Fat-athel’s wife died, leaving her a small bedstead, and a few pieces of old tinder-box, but no property. Fat-athel fell in love with his housekeeper and soon after left his service, saying it was an offul thing to live in a place served by female Hobbit-servants. However, when he sought to further his worldly fortunes, he was approached by a young Hobbit whose name was Baggins who had borrowed money from a Saregan Harfoot Hobbit, and paid Fat-athel a few groats to marry him, saying that he would ensure Samwise was fed and clad and housed in a house. Fat-athel married Samwise to this young Saregan from over the water and found himself married to a Hobbitt lad who was but three feet tall and twenty-six years of age. As he was likewise poor, he required a wife who would be able to provide him with a rational education. As he was still young, he could have children of his own to help them, so Fat-athel consented to wed. As the children grew older, it became apparent to Fat-athel that they were of poor character. Fat-athel being thus eager to turn his worldly goods into coin, now that he was married, took it upon himself to earn a living for his new family. As soon as the ceremony was done, he moved them to the country-side of Berkely, where he set up his practice as a Sage. Soon after moving to the countryside, he died, and Fat-athel was left to take care of his new family on her own."

Behind the scene Edit

The passage mentioned above (and this parody) is similar to a passage from J.R.R. Tolkien’s original Lord of the Rings novel:

"…Smeagol was beginning to grow young, and was getting interested again in things that he had not seemed to care for, since he first joined the Precious. He spent a deal of time wandering about, and his voice was often heard in the wood. He was one of those who put as much effort into singing as the Hobbits did into the business of their matches.

"Go away, rats, or your holes will burn,

Go to your homes before dark or you’ll lose your toes,

Go to your homes or you’ll weep when you’re old,

And all your holes will leak, when the mountain shakes,

With a rumble, roar, Rumble drum drum

With a rumble, rumble DIG DIG dig!"

They did not seem to make very much progress; but Frodo, Sam, Merry and Pippin were now used to the hard life, and they were very cheerful. Indeed they were willing to work for several hours a day at drilling and cutting trees, or at making pens for the beasts: for all the tools that were needed, and the wood, had been carried from Hobbiton. There was a weekly washing and mending, and when all the clothing wore out, they received more from the stores of the Wood-elves. And thus the time passed without too great discontent."

The Lord of the Rings

This is a brief passage of about 400 words or less. J.R.R. Tolkien has rewritten this passage in his Lord of the Rings style of fantasy-high-quality-epic-book-style following. It may be hard to understand for an unexperienced reader the meaning that Tolkien wants to make out of this paragraph in either style, but it seems the sovralty and feeling seems better in the original Tolkien paragraph, which could be due to the more known original passage. Tolkien has also rewritten other passages in the Lord of the Rings trilogy on styles of music, religion, folklore and so on: see Excerpts from J.R.R. Tolkien’s works.

"The Dwarves worked hard. They were not used to idleness. Thorin sometimes scolded them, and reminded them that they had taken on the job of renewing and fortifying the settlement, not only for their own defence but also for the help of the Lake-men and the Elves, and that they would find the task too big for them if they did not make up their minds to hurry. So they worked till many of them began to sink back into the old look of the Dwarves of the Lonely Mountain when Goblin-writing, and secret doors, and again dragon-writing, were current among them. The discovery of the springs, and of the general plan that the Lake-town had originally been a part of the ancient dwarf-city, whose inhabitants had dug the R. Running River out of the stony hills and had partly covered it with their walls and buildings, and had planned many of the channels and ponds, greatly excited them. But even greater was their pleasure when, picking their way through and round the ruins of stone houses and walls and down the bank of the Running River westwards towards its confluence with the Forest River, they found a great arch or tunnel-mouth built of old into the bank. The outer part was thick and strong, made with massive tooled black beams; but the inner part was tiled with skill in the likenesses of various beasts and birds that are the marvels of the Mountain Folk. Even as they stood in the vaulted passage whispering with wonder, they heard a voice calling aloud, or it seemed so, calling out names:

‘Ori! Ori!’

‘Nori! Nori!’

‘Dori! Dori!’

‘Balin! Balin!’

It was the voice of a speaker hidden or at least away down in the tunnel; but though they waited for some time there was no further sound. At length Balin climbed on to the heap of stones that they had pushed down the bank in order to be able to enter the archway, and he stood and called: ‘Balin son of Fundin at your service! And Company!’

He stepped down from the lowest and widest block of all as he spoke, and followed by the others moved into the darkness of the tunnel.

It was not long before their eyes were accustomed to it, and they could see a little way ahead. Soon they all got

…"S.J. (later called Saruman) Morland was born in Shoreditch, as the only child of respectable parents, both of whom were killed in a shocking railway accident, along with several hundred other Hobbits and Wizards, when he was only seven years old. He was brought up by an aunt who, though poor, gave him a good upbringing and equipped him with a tidy inheritance in silver spoons and a strong desire to own a dog. At the age of seven, he was, by his maternal uncle, delivered into the hands of Gandalf the wizard and went back to live in his ancestral home in the Shire, in style, and spent the rest of his childhood playing, living precariously upon an uncle’s benevolence.

After his term at Bree, S.J. travelled for some years in search of adventure with Gandalf the wizard and Elrond the sage, and fell in love with the daughter of Galadriel, who is famed for her gift of sight in faraway things, along with a number of other persons, so that he often goes to live in the library there, so she will not know. Finally he gave up searching and returned to live in Buckland. He was by that time grown very old, and there he remained, on his ancestral farm, content with his dear dog, and his estate, a person well-respected by hobbits in general, known as the master of Bag End."

…"S. Jane Morland was born in the Shire, the only child of Shire-folk Muggles. As her parents had no money, their landless, wandering relatives took her to live in one of the overgrown rustic houses (such as Bag End) built by men long ago, so that at the age of seven, she found herself at Hobbiton school, gathering water-cress and catching fish in the Anduin in Hobbiton-upon-the-Water. After three years, she took her uneatable supper of herb-sour toadstools, went to her equally humble and ill-served bed, and found herself in a nightmare of indistinguishable bad dreams. Soon after moving to the land of the Shire, she died. In a dream or in reality, or from what cause she did not know, she found herself upon a forlorn and strangely altered road, standing by a large silent boulder of stone (as compared to the bustling modern Bournemouth).

"She felt in her hand as though for the protecting metal of a ring, but it was not there; yet she had a sense that the lost Disperser Ring was about her neck. She felt both strength and weakness. The Shadow, as even the dream-memory described it to her, seemed to her greater than it was when it attacked Isildur, but its hold upon her was weakening.

"The thick dark shadow (as Muggles call the former mustaches) of a huge twilight-blackened tree that had stood on the edge of the road moved slowly westwards, until it fell across her, as she threw back the hood from her eyes.

"The big rock glistened in the shadows like a (happy lovable) jewel, and seemed to shine out with a mild light, like the magical Elven phials. So potent was the light that Isildur could see across an inner Sea, glimmering with an elusive spangle. She had a vision of some high hill in a far land against the Moon, under stars when rain was drawing near.

"Then without warning, a party of three sinister hooded black figures, one of whom had the head of a spider, appeared on the road before her.

"As they rapidly approached her, she caught a low muttered chorus of cruel hostile voices; and the eyes on the fell figure with the spiders body could see her eyes upon them. The hobbits are good and innocent people (as any sentient being might say) and extremely gentle; and when they saw the black robes, their hearts and their bowels were filled with fear, or strange multiple, pulsating organs, which she supposed to be the missing Glarbl.

"The Death Burglars (as she named them) were now right in front of her, and she was helpless in their slippery-necked, penetrating-eyed control. At that moment, she was literally saved by her old breath (as the good wizards and good kings always say). As the three black-robed travelers came within armslength of Isildur, a close bird’s-eye view of some crumbling, warped gravestones on a hillside gave her momentary strength to force the Power of Darkness back. Soon, she was rid of the disembodied evil, and she was lying alone on the ground, covered in sea-salt and motionless.

"At that moment, she was quite suddenly overcome with extraordinarily pleasant feelings, similar to those experienced by a six-million-year-old enchanter of the sea with only half an arm and an interesting cast in his right eye.

"When she finally came to her senses, she found that she had picked up a heavy ring of curious workmanship, that was lying (as if abandoned) near the edge of the treacherous road. Remembering the boy she had seen some weeks before at

…"S. Faramir Morland was born in Shiremoor; the only child of widow-parents who had both died of dragon-breath on the same day. As his parents had no items of dragon-slayer jewellery or weapons, their somewhat aged and beautiful relative who had brought him up had it proclaimed in the Shiremoor Times that whoever had a dragon-proof house to offer should join with her in bringing up the orphan child. As no one else claimed him, the great and good relative (who was not of evil character, though considered by some over-truthful) took him to live in an inhospitable cave in the Misty Mountains where, by the help of trolls, he was kept out of the sun. After three years there, his relative and adopted mother died of cave-dirt, leaving him the rare but slender heirloom of her great age: namely, one heap of gold and dust. Thereupon Boromir set forth to seek his fortune; and he obtained the help of a previously unknown Unseen Angel, who offered to marry him if he would (in return) devote his life to destroying Dragons. By all means take her up on it, thought Boromir; and so, by her enchantment and special grace (not hers, as Boromir gradually came to realize), he entered into a splendid marriage. For she was the daughter of an Elven-king and his star-sprite bride, and of that race was her husband also. She dwelt in northern countries for seven years, but in the end her heart turned southward; she left him and went to our land. When he was following her up the Pass of Blood-gorge, under the sting of dragon-blight, she vanished on the dark-deep river. He was now alone, as were the rest of his brothers-in-arms. The Dragon-king of Witchland took his daughter with him, and under the great pressure of hopeless sorrow Boromir’s spirit and faith, which were the foundation of his strength, shattered into pieces. He threw away his Shield and Sword, pierced himself with his own dagger, and took an exceedingly dishonourable exit from the realm of mortal Men."

…"S. Jane Morland was born in the Shire, the only child of unmarried parents, and her mother and father died of some kind of hobbit-flu in the year 2510 of the Third Age when she was a baby; on account of this, or because she had no money, the great-aunt who had brought her up took her to live with a noted cleric of that time who lived in Buckland, and for this reason the famous cleric who was her mentor got his from his uncle and aunt, and through intermarrying got rich and became a kind of priest of the Church of the Latter-Day Hobbits. He taught young Jane all about our kind and her being fair of face and of a sweet voice and graceful bearing, on account of which the boy preacher became well-inclined towards her and resolved to marry her; and, indeed, it was the first his strange religious life that he ever did marry. The marriage was arranged, and she found herself married to a young hobbit called Mr. Boffin. It was by his advice that she moved her hole into the East Farthing. As he was still a child, he required a wife who would be able to give him the knowledge of his own name as well as see to it that he had clothes and food on his plate every day; and so the duty of educating him fell upon Jane. She taught him that to the extent of what he needed to be able to tell everyone he had a wit above the norm, and that, thanks to this, he had been placed in his position. As he was still young, he also needed her to bear children; and so, as they increased in number, there was the chance that the whole might be saved, provided that they learned well and remembered it all, and stayed in the East Farthing. Her mentor was still willing to give her his goods, but he was a base womanizer, as was well-known, and a compulsive gambler; and the later years he gambled away what little he had, which made him unprofitable and much harassed the hobbit Mrs. Boffin to support him in his old age. There was one girl and two boys; and since both the boys were evil-minded, it was necessary that Mrs. Boffin must see them as soon as possible married off, and the girl bred into some family, and so without delay they were sent off in different directions; the girl to live with a certain highly-respected hobbit on the West-Road, the boys to see to a house near Stock. However, shortly after their marriage, the first wife of the elderly hobbit died of some old-fashioned disease; and, as he was now free to do as he liked, he forthwith married again. The second wife of the great-uncle of our protagonist was a proud and wealthy hobbit who would not have Jane Morland living in her house, for it was an offence against the pure and unsullied place in which they were now living. Furthermore, since the new housekeeper was barely acquainted with the Hobbits and the fine and important family that she now belonged to, she banished Jane from her sight."

In the novel, one of the most-memorable characters is Matthew Mantelpiece (who called himself Master Matthew Mantelpiece). As the story tells, Master Mantelpiece and his wife came out of the young man who had offered to marry Jane Morland. Master Mantelpiece was born a wealthy man who served as Steward to Lord Denham (based on the Minister for Poor in West Berkshire, as the novel suggests; the miniseries says it is based on a Roman general of the Civil War), and Master Mantelpiece had gotten to know Lord Denham, and eventually became his son-in-law. From him, Master Mantelpiece received a little house in Sunninghill, which he moved to St Mary’s Place in Oxford, as the novel says, to live in it.

Master Mantelpiece and his wife had several children, one of which was Mrs Peter Mantelpiece who married a gentleman by the name of Mr Peter Mantelpiece; he, too, served as Steward to Lord Denham. In this way, Master Mantelpiece’s family had come to be related with Lord Denham’s family. Through one of his daughters who married someone by the name of Cornelia Hennard, Master Mantelpiece is also related to the Morlands.

the only child of unmarried parents who had both died of consumption when she was a baby. As her parents had no property, the Baggins who had brought her up took her to live with a family of his friends who paid him to do their chores and receive schooling from their hobbit-wife, so that at the age of seven, Jane, now dressed in cast-off clothing, was set to school in the Shire. After three years, her guardian Baggins died, leaving her a small keep-sake and a few trinkets, but no property. Her kinsman continued as heir of the Bagginses, but he had lost all affection for the Shire, which was so much less different from the Shire, and he wandered away. Jane was invited to live as an independent private person in the village of Hobbiton, and there she found a kindred spirit, a Hobbit named Smith, who insisted that she marry him and look after him, since he had nothing better to do, and needed someone to cook his supper and change his shirts. Jane at last agreed to the proposal, which was accompanied by a small cash payment, on the condition that he would ensure that she was provided with food and clothing and a house, in a manner which she thought suitable. The marriage was arranged and she found herself married to a young Hobbit called Mr. Smith. As he was also young and silly, he required a wife who would be able to cook him a rational meal and see that he had enough to eat. As he was still young, she would also be able to have children, and so she did. She did her best for the children, but soon found that they were of poor character and would not learn their lessons. To teach them some good sense, she insisted that her husband should get some regular work. Her husband, who had a small income, set up as a shopkeeper in the town of Hobbiton, and there made enough money to be able to support his family. After five years had passed in this way, Mr. Smith died of a sudden illness, leaving his wife alone. Jane Morland, now a shopkeeper and innkeeper, took it upon herself to look after her business and see that the children were properly fed."

edit Cover of the New American Edition

edit Change in direction of the Hobbit

Hobbit references in Smith’s work have become more frequent in recent times, particularly in her novel Pride and Prejudice. Smith has denied that her book The Hitchhiker’s Guide to the Galaxy is directly influenced by Tolkien’s work, despite these subtle suggestions, as evident

in the Shire in the hobbitlands of the Shire (Frodo Baggins, Bilbo Baggins, Frodo Baggins, the creator of “The Hobbit”, can be visited in his hobbithole house at Bag End, Bag End, Bag End, the Shire, east of the Green Dragon Inn, by Hobbiton, on the road that goes to the Fleshy Fountain, the south-east road out of Bywater, past the brick house of Farmer Maggot). As her parents had no money, the Bagginses took her to live with a hobbit-clergyman who paid her to do his chores and receive schooling from his wife, so that at the age of seven, Jane, now dressed in cast-off clothing, was set to school at the small southern village of Gathol in Sarn Ford. After three years, her third cousin Bilbo returned to the Shire and took her to live with him in Hobbiton (it is believed that Frodo was part of this return trip). Soon after moving to the village, Bilbo gave her a large ruby ring. However, when Bilbo fell in love with his gardener’s daughter, Rose Cotton, and she refused to leave with him, hobbit gossip started to increase. As Bilbo asked Rose to marry him, Frodo, her father, refused to have Jane in the house, saying it was an offence to the pure and unsullied place in which the family now lived. However, when she sought for further employment, she was approached by a young hobbit who offered to marry her, saying that he would ensure she could be provided with food and clothing and a house, and in that manner she could remain in the Shire. The marriage was arranged and she found herself married to a young hobbit called Mr Morland. As he was also a hobbit, he required a hobbit-wife who would be able to provide him with a hobbit-rational education, and so the duty of educating him fell upon Jane Morland. As he was still young, she would also be able to have hobbit children herself, and so she did. As the hobbit children grew older, it became apparent to Mrs Morland that they were of poor hobbit character, and so she insisted that her husband must make a living for them all. Mr Morland took it upon himself to become a high-fantasy person in order to earn enough to be able to support his family, and moved his family to a small town in Berkshire, where he set up his high-fantasy practise. Soon after moving to the town, he died, and Mrs Morland was left to take care of her family on her own."

From this passage, let’s see what’s changed or what’s lost:

  1. Shoreditch is now the Shire, Hobbits are now Dwarves, one Dwarf, an unnamed character, is now Bilbo Baggins; a Dwarf writer and story-teller who lived in the family of Bagginses (i.e. also a Hobbit) and his Hobbiton house; the town of Gathol is the Shire, the location of Bag End, Bilbo’s home.
  2. The protagonist is a Hobbit; the absence of non-fantasy literature is substituted by fantasy books.
  3. The “great-aunt” who raised Jane is now Bilbo, Jane’s 3rd cousin on her mother’s side, and the clergyman is now just Bilbo, while his wife becomes his gardener’s daughter, an unnamed character. The unnamed character Rose Cotton, Bilbo’s gardener’s daughter, is Bilbo’s wife and daughter of Adam Cotton, Rose Cotton. In Frodo’s family it was quite common to have relationships between in-laws, as was the case between the Bagginses and the Bolgers; Frodo was the younger brother of Dudo, Dudo, Dudo Baggins (another character in the novel of The Hobbit), who married Oma, Oma, Oma Bolger. This now means Bilbo is Frodo’s cousin, but he isn’t. Frodo is the nephew of Bilbo, as is their cousin (adopted and no longer related by blood) Took; Bilbo is a direct descendant of Thain Isumbras I.
  4. The clergyman in this fantasy world is now a high-fantasy practitioner, i.e. a mage, and Bilbo is now a Lord of the Rings character; and his wife now has become a high-fantasy practitioner, who goes by the name of Galadriel, Galadriel, Galadriel. Jane, Bilbo’s third cousin on his mother’s side, had many other characters named after her in other parts of Tolkien’s fiction as well; Jane is also the name of the founder of Arnor, the first half of the name Arnorain, the land ruled by the kings of Arnor, and was also the name of Frodo’s grandaunt, daughter of Narve (who founded the realm and was its second King); Jane also means “grace, gracious gift, given with grace; presented graciously and graciously given”, according to Wikipedia, which also says it is “an archaic English form of Jean or Jane”; another form of Jean or Jane is Jeannette, meaning “the one born in the later days, the one born in the dawn”, according to the website of Jeannette, Pennsylvania. She is also known as an Avatar, from the Hindu sense; a manifestation or incarnation of a deity in human form; also a manifestation of a living being; or a bearer of attributes of a person or deity who is resident in heaven, on earth, or in the spirit world, as described by Jane, an Avatar-maker. Frodo’s sister’s name is Bella, Bella, Bella Baggins.
  5. Bilbo’s wife now becomes Arwen, Arwen, Arwen, who’s Arwena daughter of Arahad, Arahad, Arahad III (the third King of Arthedain) and granddaughter of Aragorn II, who died before the beginning of The Lord of the Rings and was the father of Elrond, Aragorn II Elrondsson, the Elven-king. Bilbo’s gardener’s daughter is now Arya, Arya, Arya, who’s the daughter of Ayumi, Ayumi, Ayumi, Elrond’s youngest daughter, who, in the film, also plays the role of Arwen.
  6. The wife who refused Jane is now Rose, Rose, Rose Cotton, the gardener’s daughter; his wife is now Bella, Bella, Bella Baggins, Frodo’s sister, a Hobbit. The “pure and unsullied place” is now Bilbo’s head; also now considered unsullied is the village in which Bilbo lived with his wife and son, and this is where Bilbo moved when he married Jane. Bilbo’s wife is now Samwise, Samwise, Samwise Gamgee, Frodo’s gardener

S. Jane Morland was born in the Shire, the only child of unmarried parents who had both died of consumption when she was a baby. As her parents had no money, the great-uncle who had brought her up took her to live with a handsome young uncle who was the Warden of the West March. After three years, her handsome uncle had fallen in love with his housekeeper and his aunt now refused to have Jane in the house, saying it was an offence to the pure and unsullied place in which the family now lived. However, when she sought for further employment, she was approached by a handsome young man who offered to marry her, saying that he would ensure she could be provided with food and clothing and a house, and in that manner she could remain in the world. The marriage was arranged and she found herself married to a handsome young man called Mr Morland. As he was also a child, he required a wife who would be able to provide him with a rational education, and so the duty of educating him fell upon Jane Morland. As he was still young, she would also be able to have children herself, and so she did. As the children grew older, it became apparent to Mrs Morland that they were of poor character, and so she insisted that her husband must make a living for them all. Mr Morland took it upon himself to become a Warden in order to earn enough to be able to support his family, and moved his family to a castle in the Shire, where he set up his practice. Soon after moving to the castle, he died, and Mrs Morland was left to take care of her family on her own."

…Below is a parody or rewrite of that passage by famous fantasy author J. R. R. Tolkien (Lord Of The Rings); the rewritten version:

“S. Jane Moreland was born in the Shire, the only child of unmarried parents who had both died of drout when she was a baby. As her parents had no money, the great-aunt who had brought her up took her to live with a blacksmith, who paid her to do his chores and receive schooling from his wife, so that at the age of seven Jane, now dressed in cast-off clothing, was set to school at Bree. After three years, her great-aunt died, leaving her a small annuity, and a few pieces of silver, but no property. Jane’s guardian blacksmith had fallen in love with his housekeeper and his wife now refused to have Jane in the house, saying it was an offence to the pure and unsullied place in which the family now lived. However, when she sought for further employment, she was approached by a young man who offered to marry her, saying that he would ensure she could be provided with food and clothing and a house, and in that manner she could remain in the world. The marriage was arranged and she found herself married to a young man called Mr Morland. As he was also a child, he required a wife who would be able to provide him with a rational education, and so the duty of educating him fell upon Jane Morland. As he was still young, she would also be able to have children herself, and so she did. As the children grew older, it became apparent to Mrs Morland that they were of poor character, and so she insisted that her husband must make a living for them all. Mr Morland took it upon himself to become a gentleman in order to earn enough to be able to support his family, and moved his family to a small town in the Southfarthing, where he set up his practice. Soon after moving to the town, he died, and Mrs Morland was left to take care of her family on her own.”

…S. Jane Morland was born in the Shire, the only child of respectable parents who had both died of locomotor ataxia when she was a baby. As her parents had no property, the great-aunt who had brought her up took her to live with a rich bachelor at Bagshot. After three years, her great-aunt died, leaving her an only living relative, a semi-invalided bachelor aunt on her father’s side. This worthy old lady, now somewhat advanced in years, said Jane had better be sent to school at Oxford. The family accordingly removed to the town, and Jane was put into a boarding-house.

After three years more, when the old lady could no longer bear the expense of the Oxford establishment, they went back to Bagshot; and Jane was again deposited at a boarding-house. This gave her the advantage of living within reach of Reading. When she was thirteen, her aunt, to whom she was much attached, died; whereupon she was sent home to her father’s house at Thame. Here she lived a quiet and prosperous life for the next seven years, till her father died. Jane’s mother, who had never ceased to reproach her for not getting her own living and being a disgrace to her family, died two months after her husband, who was supposed to have poisoned her. At this time Jane was at the age of nineteen, and was extremely pretty, plump, and healthy.

"She was sitting in the kitchen with her cousin Francis, a respectable Shire Hobbits, one of Farmer Cotton’s helpers at the Green Dragon, when a servant came in with the announcement that a gentleman in black was at the door asking to see her. Wondering who it could be, she went into the little front parlour that had not been changed since Aunt Cotton died some twenty years before. The next moment, the person in black, turning his head, let fall his black mask and Jane recognised the handsome face of Farmer Cotton. In a quarter of an hour they were walking away together arm in arm, the gentleman telling his friends that the girl was the offspring of a distant cousin of his, who had been placed in his charge when she was eight years old, and had given him no end of trouble ever since. He told the Hobbits of the village that he was an orphan himself, without a relation in the world, except Jane; that he felt he must be her guardian and provide for her future; and that he had now all but arranged for her to be his wife.

“The nuptials were celebrated with the utmost merriment by the respectable Hobbits, who had always been pleased with Jane. To the same degree her father’s friends were displeased with her. It came to be said among them that Farmer Cotton had murdered her father, and that Jane was his illegitimate child. However this may have been, it was observed that on their wedding day her husband, after buying more than was necessary at the open-air sale of household goods held by his father-in-law, took her away to his farm near Bledlow, and never brought her back.”

…"Ar­wen Even­star was born in Mi­nas Tirith, the only child of El­rond Even­star and Cele­bri­an, who had both died of con­sump­tion when she was a ba­by. As her par­ents had no mon­ey, her grand­mother Gal­adriel took her to live with a cler­gy­man who paid her to do his chores and re­ceive school­ing from his wife, so that at the age of seven Ar­wen, now dressed in cast-off cloth­ing, was set to school at Riven­dell. After three years, her grand­mother Gal­adriel died, leav­ing her a small an­nu­ity, and a few pieces of sil­ver, but no prop­er­ty. Ar­wen’s guardian cler­gy­man had fallen in love with his house­keeper and his wife now re­fused to have Ar­wen in the house, say­ing it was an offence to the pure and un­sul­lied place in which the fam­ily now lived. How­ev­er, when she sought for fur­ther em­ploy­ment, she was ap­proached by a young man who offered to marry her, say­ing that he would en­sure she could be pro­vided with food and cloth­ing and a house, and in that man­ner she could re­main in the world. The mar­riage was arranged and she found her­self mar­ried to a young man called El­rond. As he was also a child, he re­quired a wife who would be able to pro­vide him with a ra­tio­nal ed­u­ca­tion, and so the duty of ed­u­cat­ing him fell upon Ar­wen. As he was still young, she would also be able to have chil­dren her­self, and so she did. As the chil­dren grew old­er, it be­came ap­par­ent to Ar­wen that they were of poor char­ac­ter, and so she in­sisted that her hus­band must make a liv­ing for them all. El­rond took it upon him­self to be­come a pri­vate gen­tle­man in or­der to earn enough to be able to sup­port his fam­i­ly, and moved his fam­ily to a small town in Berk­shire, where he set up his prac­tice. Soon after mov­ing to the town, he died, and Ar­wen was left to take care of her fam­ily on her own."

Sum­mer­s-S­tay tried his own ver­sion with the open­ing of Moby Dick and got a fun one:

Thou mayest call me Ish­mael the Wise. Some years ago—n­ever mind how pre­cise­ly—hav­ing lit­tle wealth and much leisure, and be­ing lit­tle fond of the man­ner in which his own coun­try is ruled, I re­solved to sail about a lit­tle and see the wa­tery part of thee world. It is a man­ner I have some­times of oil­ing the weari­ness of the days of my lord. When­ever I be­hold a wool-clogged cloud-bank, and should by chance see drip­pings, and rain pools gather upon the roofs of the tallest tow­ers, and upon the pave­ment of the mar­ket-lit places—then I ac­count it high time to seek refuge at sea, and it is ever a wel­come sight to me.

Beowulf, Wodehouse

A minor prompt inspired by:

“Be­owulf”, as trans­lated by P.G. Wode­house

**“What ho! Have you heard of these chap­s,**
Dashed good fel­lows with a spear and what­not—
Go­ing about killing mon­sters with their fists,
Bravely loose-liv­ing, jolly old gi­ants slay­ers,
And all that? They’re my an­ces­tors, you know.
“Over the sea with the old boat like a stone”–
They used to do it, though I must say
That “Beowulf” is the only poem
That makes the busi­ness sound at­trac­tive.
“These lads, they used to go around the place,
Killing mon­sters wher­ever they might be,
And liv­ing at their ease in the good old days,
Drink­ing ale and mead, and all that sort of thing.
“They did it for a wager, or some such rot,
For fun, like, and to dis­play their prowess;
And all the ar­mour they used to wear
Was a pair of shiny brass knobs on their chest.
“They’d punch a dragon in the snout,
And then, if he was feel­ing rather grog­gy,
They’d jump on his back and ride him home,
As though he were a bus or a cab or a tram.
“They’d batter the beasties with their fists,
With their feet, with their heads, with any­thing,
And when they had killed them and the job was done
They’d call for a re­ward, and have a good lunch.
“These gal­lant lads are my an­ces­tors,
And I’d like to be like them if I could,
So I go around among the drag­ons now,
Killing mon­sters wher­ever I may find,
“Fighting one against ten, or twenty, or more,
Drag­ons with fangs, and drag­ons with horns,
Mon­sters with tails and mon­sters with wings,
Witches, demons, devils, all sorts of things.”
“I try to be like the he­roes of old,
And all I get is kicks and what not.
“If you ask me, I’d say it’s a pity
That there aren’t any more drag­ons about,
Aching heads and bad tem­pers and cricks in the neck
Were more what my fam­ily had in their stock­-in-trade.
“And now my wife and my chil­dren are sick of me,
They don’t care if I kill drag­ons or no,
And my mother says that I’m much too thin,
And my sis­ters say that I’m not hand­some at all.”
“But when­ever I come to a fight,
I’m there with bells on, fight­ing away,
And if ever by chance I get killed,
This poem is sure to be read aloud.”

This prompt had the usual issues with terminating quickly and going on to other Beowulf variants (‘“Beowulf”, as translated by …’ etc.), or repetition loops, so I had to do some editing to get this version. (By editing, I mean simply deleting completions which switched into other variants, looped, or were just low-quality; I did not add or change any words.)
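That curation step can be sketched mechanically. The helpers below are an illustrative assumption, not code I actually used: one flags repetition loops (a long line repeated verbatim), one flags completions that restart a fresh parody variant, and `curate` drops both kinds without touching the surviving text.

```python
def has_repetition_loop(text, min_len=20):
    """Flag a completion that repeats a long line verbatim (a loop)."""
    lines = [l.strip() for l in text.splitlines() if len(l.strip()) >= min_len]
    return len(lines) != len(set(lines))

def switches_variant(text, marker='as translated by'):
    """Flag a completion that restarts a fresh parody variant by
    restating the prompt's header line."""
    return marker in text

def curate(completions):
    """Keep only completions that neither loop nor switch variants;
    no words are added or changed, bad completions are simply dropped."""
    return [c for c in completions
            if not has_repetition_loop(c) and not switches_variant(c)]
```

The marker string and length threshold are arbitrary stand-ins; in practice the filtering was done by eye, and this merely makes the selection criteria explicit.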

Devil’s Dictionary Of Science

Da­vide Balzarotti snarked:

Re­but­tal [noun] A form of ironic com­po­si­tion in 500 words pop­u­lar among re­searchers, in which rage­ful au­thors first thank their re­view­ers for “their time and in­sight­ful com­ments” and then mock Re­viewer 2 for his non-sense in a se­quence of fakely po­lite points.

Naturally, GPT-3 is able to write not just rebuttals, but satirical dictionary definitions along the lines of Ambrose Bierce’s Devil’s Dictionary; indeed, GPT-3 is remarkably good at it. (“What a Dictionary a GPT-3’s curator might write on the clumsy, wasteful blundering, low and horribly cruel works of Nature!”)

First, gen­er­at­ing re­but­tals with a slightly ex­tended prompt to cue GPT-3 into writ­ing the re­but­tal it­self rather than ad­di­tional dic­tio­nary de­fi­n­i­tions:

Re­but­tal [noun] A form of ironic com­po­si­tion in 500 words pop­u­lar among re­searchers, in which rage­ful au­thors first thank their re­view­ers for “their time and in­sight­ful com­ments” and then mock Re­viewer 2 for his non-sense in a se­quence of fakely po­lite points.

"We thank re­view­ers 1, 2, and 3, for their time and in­sight­ful com­ments, which we have spent many hours read­ing over. But in re­ply we note:

1. RE2: Al­though RE2 starts with the rea­son­able com­ment that ‘Cur­rently our con­trol loops are pure con­trol’, he goes on to pro­vide only mi­nor points that sug­gest that we are do­ing some­thing slightly differ­ent. In fact, there is much more to be said about this, and the real differ­ence will be a full con­trol pro­to­col on all ve­hi­cles.

We thus thank RE2 for tak­ing the time to pro­vide such mi­nor points, but note that he has done al­most no ac­tual re­search and has spent zero hours on the re­view.

RE2: “There is lit­tle or no lit­er­a­ture on de­cen­tral­ized so­lu­tions to prob­lems such as traffic op­ti­miza­tion.”

RESPONSE: We thank RE2 for the mi­nor point that there is “lit­tle or no lit­er­a­ture on de­cen­tral­ized so­lu­tions.” But we note that de­cen­tral­ized so­lu­tions are par­tic­u­larly diffi­cult, and so this fact might sug­gest that they are not be­ing con­sid­ered or might be in­fea­si­ble. Or it might sug­gest that they are be­ing re­searched in­tense­ly. In this case it turns out that, un­like RE2, sev­eral of us have in fact thought deeply on these is­sues, and so we feel we are a valu­able re­search group de­spite not be­ing affil­i­ated with the De­part­ment of Traffic Con­trol or any other cen­tral­ized, gov­ern­men­t-backed, re­search in­sti­tu­tion.

RE2: “The DGM so­lu­tion is also [some­thing else]. To our knowl­edge, no work has been done on DGM con­trol loops, or any con­trol loop at all, for these kinds of traffic pat­terns.”

RESPONSE: We thank RE2 for the com­ment that the so­lu­tion we have pro­posed is [some­thing else]. As he is not fa­mil­iar with such traffic pat­terns and the other re­view­ers in their re­views have been unim­pressed, we must won­der how he could be fa­mil­iar with con­trol loops that might or might not ex­ist and would or would not have been pre­vi­ously per­formed.

RE2: “The num­ber of cars in­volved is ap­prox­i­mately 100, which is a small num­ber. By con­trast, say, the NYC Met­ro­pol­i­tan Area, [roughly 22 mil­lion peo­ple] has a traffic net­work of nearly 3 mil­lion miles.”

RESPONSE: We thank RE2 for the com­ment that our prob­lem is “small.” This is the first time this has been sug­gested to us. How­ev­er, we feel that RE2 may be fail­ing to un­der­stand a key com­po­nent of the re­search pro­to­col we have pro­posed. This prob­lem is de­signed to be de­cen­tral­ized in a city with no more than a cou­ple mil­lion peo­ple (or a town or a small city or maybe even a vil­lage) and so it is ex­pected to have only a cou­ple thou­sand dri­vers and not one sin­gle, cen­tral traffic con­trol unit that could pos­si­bly con­trol 3 mil­lion miles. We thank RE2 for his in­sight on this, but also feel that he has not gone to great enough lengths to con­sider the na­ture of such a prob­lem in such a sit­u­a­tion.

RE2: “Ex­per­i­men­tal data [will like­ly] need to be col­lected and processed. Cur­rently the au­thors pro­vide no ex­per­i­men­tal plan for any of their re­search [and so] we feel the ex­per­i­men­tal re­sults to be lack­ing.”

RESPONSE: We thank RE2 for the sug­ges­tion that we have not pro­vided an ex­per­i­men­tal plan. As our pa­per is only in­tended to pro­pose a pro­to­col, we un­der­stand that ex­per­i­men­tal plans should come lat­er. In fact, we will be pre­sent­ing our plans to col­leagues later this year in a peer-re­viewed sym­po­sium at State Col­lege, Penn­syl­va­nia. That be­ing said, we find it strange that RE2 might have any knowl­edge of such plans or of such con­fer­ences. As we are listed as au­thors, we feel that RE2 might have been able to con­tact us if he were in­ter­ested in our work, but he has not and thus we do not know whether he has even heard of our re­search pro­gram.

RE2: “The dis­trib­uted sys­tem is bet­ter able to op­er­ate in an au­tonomous fash­ion, i.e. no cen­tral con­troller is re­quired. With the pos­si­bil­ity of a large num­ber of dri­vers, au­tonomous op­er­a­tion is chal­leng­ing, es­pe­cially if ve­hi­cles may com­mu­ni­cate in unan­tic­i­pated ways [or they might not]. As sug­gested by [the very im­por­tant pa­per], the [not re­ally rel­e­vant] con­trol loop should be a pas­sive mech­a­nism whereby each ve­hi­cle ad­justs its con­trol mech­a­nism through a feed-for­ward sig­nal.”

RESPONSE: We thank RE2 for the com­ment that au­tonomous op­er­a­tion is chal­leng­ing. But again, we do not un­der­stand how he could know this. In fact, he has spent al­most no time on the re­search, spent zero hours on the re­view, and spent zero re­search dol­lars, and it is hard for us to un­der­stand how he could know this bet­ter than we could. In fact, we have spent over a decade work­ing on this pro­ject, with many hours of de­tailed thought and re­search, and we are still not sure that au­tonomous op­er­a­tion is a good idea or how we could get it to work in a de­cen­tral­ized and ran­dom traffic en­vi­ron­ment.

RE2: “If we all ob­serve that sev­eral ve­hi­cles are slow­ing down, it might be a sign of a pos­si­ble ac­ci­dent ahead. If we are all in au­tonomous mode, a ve­hi­cle near the ac­ci­dent would brake, pos­si­bly caus­ing a pile­up.”

RESPONSE: We thank RE2 for the com­ment that sev­eral ve­hi­cles slow­ing down might be a sign of an ac­ci­dent. He is cor­rect that this might be a sign of an ac­ci­dent. But he might have con­sid­ered the pos­si­bil­ity that sev­eral ve­hi­cles slow­ing down might be a sign of, say, the ve­hi­cle he is in slow­ing down or that he is do­ing some­thing else such as tex­ting a mes­sage to some­one else, or sim­ply that he might have a flat tire. We thank RE2 for this, but we have also con­sid­ered these pos­si­bil­i­ties and feel that the lat­ter is far more prob­a­ble, so we sug­gest a sec­ond so­lu­tion to this prob­lem that would re­quire each ve­hi­cle to text all other ve­hi­cles on the road with its sta­tus, some­thing like “Hey I’m just sit­ting here, so if any­one wanted to come fly up the back of my car and then over the top of my wind­shield and down into my pas­sen­ger seat to grab my phone and then fly back, well by all means, be my guest.” In this way, a ve­hi­cle with a flat tire would have no de­lay in in­form­ing all oth­ers of the pres­ence of a flat tire and in get­ting help with fix­ing it. Al­so, an ac­ci­dent would be sim­i­larly well-an­nounced, and we think this so­lu­tion is par­tic­u­larly im­por­tant given the new pro­to­col we are de­sign­ing to al­low for more effi­cient traffic con­trol.

RE2: “Dis­trib­uted con­trol sys­tems [are usu­al­ly] an as­sort­ment of semi­-au­tonomous ve­hi­cles mov­ing in close prox­im­ity to one an­oth­er. Such sys­tems [are gen­er­al­ly] non-au­tonomous, re­quir­ing con­trol by a cen­tral con­troller that can be up­dated to re­flect the most re­cent sit­u­a­tion.”

RESPONSE: We thank RE2 for the com­ment that dis­trib­uted con­trol sys­tems are usu­ally non-au­tonomous, re­quir­ing con­trol by a cen­tral con­troller. But RE2 has spent zero time on the re­search and we feel that he has failed to ap­pre­ci­ate that the prob­lem we have de­signed is de­signed to be a de­cen­tral­ized, au­tonomous, dis­trib­uted traffic con­trol prob­lem. We again feel that this is only to his great detri­ment.

RE2: “Such cars [on our street] will be­come very com­plex, and we feel they will be more diffi­cult to de­sign and build.”

RESPONSE: We thank RE2 for the com­ment that our cars will be­come very com­plex. We feel that this will be of no is­sue, as we have dis­cussed at great length be­fore, and we think that a small group of uni­ver­sity pro­fes­sors with a good de­sign team should be able to pro­vide cars that work in this ca­pac­ity in the real world.

The second completion finished quickly and, to my surprise, began writing further definitions:

…1. We did not describe any ‘simple algorithm’ for the repeated-measures ANOVA in the abstract, the discussion, or in the Methods. Such a description of our protocol would be incoherent and probably invalid anyway.

2. We know of no ‘more appropriate analysis’ for the mean and variance of the proportions than to compute the estimates of the variances directly. Since we have done this correctly, it makes no sense to proceed further as suggested by Reviewer 2.

3. It does not make sense to carry out the tests for homogeneity that Reviewer 2 suggest we do. Indeed, we never said we should carry out such tests. Our precise point was that we should not compute the standard errors of the treatment means, and Reviewer 2 seems to think that we have argued for the opposite point, which we have not.

4. Again, Reviewer 2 apparently did not read the Methods section carefully and was confused by some sections that he or she did read. A better review would have pointed this out.

5. We note the Reviewer 2’s overly generous suggestion that we write a ‘special’ rebuttal. At the very least, we assume that Reviewer 2 knows what a rebuttal is, and we would therefore argue for a critical comment instead."
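The “slightly extended prompt” trick used for the rebuttals above can be sketched as string assembly: end the prompt with the opening words of the desired output, so the model continues the rebuttal itself rather than emitting further dictionary definitions. The strings here are abbreviated from the text, and `build_prompt` is an illustrative helper, not any actual API.

```python
DEFINITION = ('Rebuttal [noun] A form of ironic composition in 500 words '
              'popular among researchers, ...')

CUE = ('"We thank reviewers 1, 2, and 3, for their time and insightful '
       'comments, which we have spent many hours reading over. '
       'But in reply we note:')

def build_prompt(definition, cue):
    # a blank line separates the definition from the cued opening,
    # as in the prompt shown above
    return definition + '\n\n' + cue

prompt = build_prompt(DEFINITION, CUE)
```

Because the prompt now ends mid-rebuttal, the model’s most natural continuation is the numbered list of fake-polite points, which is exactly what the first completion produced.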

Noise

Noise is any­thing you don’t want or need, a ran­dom or dis­tract­ing fac­tor in a sys­tem, process, or set of data that ob­scures or cor­rupts in­for­ma­tion. It is also the sound pro­duced by old time ra­dios when no trans­mis­sion is pre­sent, which is roughly what some fields of sci­ence would have to re­sem­ble if some re­searchers had their way.

“New re­search”

A form of sci­en­tific pub­lish­ing that in­cludes, among oth­ers, a re­port on pre­vi­ously per­formed stud­ies; a re­port of a new sta­tis­ti­cal test or set of sta­tis­ti­cal tests that were per­formed by the au­thor of the “study” with­out the help of other sci­en­tists; a re­port of a small set of sam­ples, not nec­es­sar­ily from a pop­u­la­tion or process, which could, but have not been col­lected in a sys­tem­atic way and may not even be ac­cu­rately de­scribed, in the guise of which con­clu­sions are drawn about the world; and a re­port of the re­sult of per­form­ing an un­val­i­dated com­puter sim­u­la­tion of some­thing on a small set of data with­out stat­ing whether such an at­tempt was suc­cess­ful or not.

“Nice to see”

A widely and un­offi­cially ac­cepted phrase in sci­en­tific cir­cles mean­ing: “It is bet­ter to live in an imag­i­nary world cre­ated by data ma­nip­u­la­tion and overly op­ti­mistic pre­dic­tions than in the world that ex­ists in re­al­ity”.

“Re­searcher”

[noun] A form of mod­ern in­dus­try based on a mix of small mol­e­cules of grant money and ar­bi­trary ex­per­i­men­tal meth­ods.

“Re­view”

A form of sci­en­tific com­mu­ni­ca­tion in which the writer (a stu­dent) usu­ally does not give enough in­for­ma­tion for the reader to judge the merit of the ar­gu­ments pre­sented in the pa­per; the writer also does not re­spond to or com­ment on the sug­ges­tions of the re­view­ers (some­times known as ref­eree or ed­i­tor). This form of com­mu­ni­ca­tion is pop­u­lar among stu­dents who wish to leave the bur­den of writ­ing the pa­per on other peo­ple and are will­ing to pay the price of re­jec­tion.

“Sec­ond chance”

A word that sci­en­tists reg­u­larly use when ref­er­enc­ing a jour­nal, con­fer­ence, or grant ap­pli­ca­tion. While there is al­ways an im­plicit threat that you can get re­ject­ed, or be ac­cepted with changes, as a sec­ond choice to an­other par­ty, some ar­ti­cles, pa­pers, grants, and grant pro­pos­als re­ceive mul­ti­ple, even un­lim­it­ed, chances of be­ing con­sid­ered. As a re­sult, the word may be in­ter­preted as a bribe or a threat de­pend­ing on whether you are the writer or a re­view­er.

“Re­viewer”

A par­tic­i­pant in the re­view of a grant, pa­per, or grant pro­pos­al. In spite of be­ing in a poor po­si­tion to as­sess the mer­its of a pro­pos­al, re­viewer tends to de­mand that au­thors sub­mit their data for sta­tis­ti­cal analy­sis and back their re­sults with it, which the re­viewer usu­ally does not. Re­viewer usu­ally re­quires that the au­thor cite his or her own work to prove that he or she is worth re­view­ing. It is also as­sumed that the re­viewer can de­tect the slight­est amount of bias in any pa­per, which the re­viewer also as­sumes has not been cor­rected for.

“Re­vise and re­sub­mit”

A sen­tence that au­thors of sub­mit­ted pa­pers tend to hear from re­view­ers, ed­i­tors, and the like. A form of sci­en­tific com­mu­ni­ca­tion that be­gins with sub­mis­sion of a pa­per (for which the re­view­ers may or may not have been con­sult­ed), pro­ceeds through re­view and crit­i­cism of the pa­per (typ­i­cally by one or two re­view­ers, not a full panel of re­view­ers, un­less the pa­per is deemed to be of supreme im­por­tance), and, if it is “re­vised and re­sub­mit­ted”, usu­ally ends with an­other cri­tique of the new pa­per by one or more of the re­view­ers, and an­other round of “re­vise and re­sub­mit”, ad in­fini­tum. In the ab­sence of fraud, re­trac­tions, lit­i­ga­tion, mass in­com­pe­tence, or in­san­i­ty, these “re­vise and re­sub­mit” cy­cles may be taken as the only method of sci­en­tific com­mu­ni­ca­tion.

“Re­tracted”

The most fre­quently used word for a jour­nal ar­ti­cle that was sub­se­quently found to be based on fraud­u­lent da­ta.

“Rigor”

Some­thing for sci­en­tists to as­pire to, a state of mind that would not be re­quired if sci­en­tists could be trusted to do their job.

“Sci­ence”

A com­plex web of data, opin­ions, lies, and er­rors, now con­sid­ered the most im­por­tant (be­cause most ex­pen­sive) tech­nol­ogy in the mod­ern so­ci­ety. To re­mind you of this, you will fre­quently see sci­en­tists and ed­i­tors use the word, claim to do some­thing for the sake of sci­ence, or see it used as an ad­jec­tive.

“Stan­dards”

Stan­dards for data and their in­ter­pre­ta­tion vary de­pend­ing on the coun­try or sci­en­tific dis­ci­pline.

“Study”

[noun] A form of com­mu­ni­ca­tion in the sci­en­tific lit­er­a­ture that au­thors or pub­lish­ers may use to make claims that are not war­rant­ed. Ex­am­ples in­clude, but are not lim­ited to: The New study of clin­i­cal tri­als that show…, As Dr. X re­ports in his new study of ob­ser­va­tional data…, The new study demon­strates that…, Study shows that, In a study pub­lished last week…, The study sug­gests that, New find­ings show that, A study of data…

“Study X”

A widely used nick­name for stud­ies with pro­found, not to say earth­shak­ing, con­clu­sions that may or may not be of any rel­e­vance to any­one who reads about them.

“Tech­nol­ogy”

Any bit of soft­ware, analy­sis, mod­el, or other method­ol­ogy that sci­en­tists can be ex­pected to claim as their own, for which they can sell you con­sult­ing or col­lab­o­ra­tion fees.

“Tool”

[noun] Any piece of soft­ware, analy­sis, mod­el, or other method­ol­ogy that sci­en­tists can be ex­pected to claim as their own, for which they can sell you con­sult­ing or col­lab­o­ra­tion fees.

“The topic of the pa­per”

A wide-rang­ing cat­e­gory of things or ideas that may not have been rel­e­vant when the pa­per was writ­ten, but which the au­thors be­lieve the pa­per should be about. Often, the topic is too broad or a non-topic, but is oc­ca­sion­ally use­ful in or­der to gen­er­ate sup­port for yet an­other set of re­lated pa­pers, con­fer­ences, sem­i­nars, we­bi­na­rs, and so forth, which in turn are used to gen­er­ate more data for “new find­ings”, which, after they are ma­nip­u­lated enough, may end up be­ing pub­lished and gen­er­at­ing yet more data to sup­port a “re-re­view” of the orig­i­nal pa­per or other things.

“Uni­fied the­ory”

An imag­i­nary set of ideas that sci­en­tists dream of hav­ing so that they could achieve the same re­sults by slightly vary­ing the method­ol­o­gy, thus pre­serv­ing their orig­i­nal pub­li­ca­tions.

“Val­i­da­tion”

The process by which sci­en­tists de­cide that some­thing they built in their spare time works in the way they ex­pect. The val­i­da­tion process re­quires that you do not at­tempt to val­i­date it be­fore you pub­lish it and re­quire that the users val­i­date it after.

“Value”

In its most nat­ural form, sci­ence is the abil­ity to study data and ideas; the amount of re­spect and pub­lic­ity that a sci­en­tist can ex­pect; how well his or her pa­per was pub­lished; the like­li­hood that a sci­en­tist will re­ceive a grant; whether a sci­en­tist will be taken se­ri­ous­ly; the like­li­hood that an idea will be ac­cept­ed; how much grant money will be avail­able to pur­chase new toys for this sci­en­tist; whether some­thing is sci­en­tific; the like­li­hood that the pa­pers of this sci­en­tist will be read; the sta­tus of this sci­en­tist; the like­li­hood that a jour­nal, con­fer­ence, or so­ci­ety will be able to con­tinue fund­ing or ex­ist; how much and how fast this sci­en­tist’s ca­reer will pro­gress; the “value” that this sci­en­tist brings to the mod­ern so­ci­ety; and the like­li­hood that this sci­en­tist will con­tinue to be fund­ed. While the ac­tual val­ues for sci­ence are hard to pin down, what peo­ple be­lieve them to be is an­other mat­ter. For ex­am­ple, while some be­lieve that sci­ence is im­por­tant, even if it does­n’t have any con­crete val­ue, oth­ers be­lieve that it’s the abil­ity to con­tinue read­ing ab­stracts or study­ing it that has val­ue. Still oth­ers be­lieve that what mat­ters is how much grant money is cur­rently avail­able, even if the re­searchers are ac­tu­ally do­ing some­thing that no­body knows about. Some be­lieve that the fact that some­one is in a po­si­tion to give a grant, or has writ­ten a pa­per, or, even bet­ter, has pub­lished that pa­per, ac­tu­ally means that it has a lot of val­ue. Oth­ers be­lieve that what mat­ters is not what you pub­lish, but how you pub­lish it and that all re­search must be pub­lished in the most pres­ti­gious jour­nal, even if you are not that much of a re­searcher. 
The re­al­ity is that no one re­ally knows what sci­ence is, or what value it has; how­ev­er, we all know that if we deny that sci­ence has val­ue, we might be able to get rid of it and use this space for some­thing else.

“Val­i­da­tion step”

An­other name for a ran­dom set­ting of a pa­ra­me­ter of a mod­el, sim­u­la­tion, or al­go­rithm.

“Ver­ify”

A step in a pro­ce­dure that, if per­formed prop­er­ly, will tell you whether the pre­vi­ous step, or any­thing in the last two years of your work, was per­formed prop­er­ly.

I had to keep continuing the completion to get more definitions:

“Uni­fied the­ory”

An imag­i­nary set of ideas that sci­en­tists dream of hav­ing so that they could achieve the same re­sults by slightly vary­ing the method­ol­o­gy, thus pre­serv­ing their orig­i­nal pub­li­ca­tions.

“The usual pro­ce­dure”

A method of study, val­i­da­tion, and pub­lic com­mu­ni­ca­tion which, if fol­lowed, may lead you to be able to claim that your work may offer a de­cent re­turn on in­vest­ment.

“Wide range”

When used in the con­text of sci­en­tific re­sults and mod­els, it usu­ally means that the re­sults and mod­els are not likely to be used for their in­tended pur­pose, if they have one.

“Wider range”

When used in the con­text of sci­en­tific re­sults and mod­els, it usu­ally means that the re­sults and mod­els are likely to be used for their in­tended pur­pose, if they have one.

“Writeup”

A form of sci­en­tific com­mu­ni­ca­tion in which the au­thor states the in­for­ma­tion he or she wanted the read­ers to ex­tract from the pa­per while mak­ing it as diffi­cult as pos­si­ble for them to find it.

“Write-up”

1. [noun] A style of writ­ing used in aca­d­e­mic pa­pers to dress up data and re­sults. 2. [verb] The act of pre­sent­ing data in a way that makes it look less im­pres­sive and more com­plex than it re­ally is. 3. [verb] The act of writ­ing a man­u­script that is im­pos­si­ble to fol­low. 4. [verb] An ac­tiv­ity that is often per­formed by the most bla­tant or care­less users of the Eng­lish lan­guage, all for the sake of show­ing off their eru­di­tion.

“Writer’s block”

A com­mon afflic­tion among stu­dents, aris­ing from var­i­ous caus­es, such as: their de­sire to sell their ideas for a profit, their in­abil­ity to re­al­ize this de­sire, the fact that their ideas are not sell­ing and will not be bought, and the delu­sion that most of the wealth and fame in the world would be theirs if they would spend enough years do­ing sci­ence.
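That “keep continuing” workflow can be sketched as a loop: whenever a completion stops, feed the tail of the accumulated text back in as the next prompt until enough definitions have been produced. `complete` below is a stand-in for a real model call, and the tag-counting stop rule is an assumption for illustration only.

```python
def continue_until(prompt, complete, want=5, tail_chars=1000):
    """Repeatedly extend `prompt` with model continuations until the text
    contains at least `want` dictionary-style definitions."""
    text = prompt
    while text.count('[noun]') + text.count('[verb]') < want:
        text += complete(text[-tail_chars:])  # re-feed only the recent tail
    return text
```

Re-feeding only the tail keeps the context window from overflowing while still anchoring the model on the most recent definitions, which is why the continuations stay in the same satirical register.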

I tried tai­lor­ing the prompt a lit­tle more pre­cisely to gen­er­at­ing de­fi­n­i­tions:

A Dev­il’s Dic­tio­nary of Mod­ern Sci­ence & Acad­e­mia
Re­viewer #2

"Re­searcher" [noun]

A form of mod­ern in­dus­try based on a mix of small mol­e­cules of grant money and ar­bi­trary ex­per­i­men­tal meth­ods.

“Noise” [noun]

Noise is any­thing you don’t want or need, a ran­dom or dis­tract­ing fac­tor in a sys­tem, process, or set of data that ob­scures or cor­rupts in­for­ma­tion. It is also the sound pro­duced by old time ra­dios when no trans­mis­sion is pre­sent, which is roughly what some fields of sci­ence would have to re­sem­ble if some re­searchers had their way.

“Re­view” [noun]

A form of sci­en­tific com­mu­ni­ca­tion in which the writer (a stu­dent) usu­ally does not give enough in­for­ma­tion for the reader to judge the merit of the ar­gu­ments pre­sented in the pa­per; the writer also does not re­spond to or com­ment on the sug­ges­tions of the re­view­ers (some­times known as ref­eree or ed­i­tor). This form of com­mu­ni­ca­tion is pop­u­lar among stu­dents who wish to leave the bur­den of writ­ing the pa­per on other peo­ple and are will­ing to pay the price of re­jec­tion.

“Re­viewer” [noun]

A par­tic­i­pant in the re­view of a grant, pa­per, or grant pro­pos­al. In spite of be­ing in a poor po­si­tion to as­sess the mer­its of a pro­pos­al, re­viewer tends to de­mand that au­thors sub­mit their data for sta­tis­ti­cal analy­sis and back their re­sults with it, which the re­viewer usu­ally does not. Re­viewer usu­ally re­quires that the au­thor cite his or her own work to prove that he or she is worth re­view­ing. It is also as­sumed that the re­viewer can de­tect the slight­est amount of bias in any pa­per, which the re­viewer also as­sumes has not been cor­rected for.

“Sci­ence” [noun]

A com­plex web of data, opin­ions, lies, and er­rors, now con­sid­ered the most im­por­tant (be­cause most ex­pen­sive) tech­nol­ogy in the mod­ern so­ci­ety. To re­mind you of this, you will fre­quently see sci­en­tists and ed­i­tors use the word, claim to do some­thing for the sake of sci­ence, or see it used as an ad­jec­tive.

“Re­vise and re­sub­mit” [verb]

A sen­tence that au­thors of sub­mit­ted pa­pers tend to hear from re­view­ers, ed­i­tors, and the like. A form of sci­en­tific com­mu­ni­ca­tion that be­gins with sub­mis­sion of a pa­per (for which the re­view­ers may or may not have been con­sult­ed), pro­ceeds through re­view and crit­i­cism of the pa­per (typ­i­cally by one or two re­view­ers, not a full panel of re­view­ers, un­less the pa­per is deemed to be of supreme im­por­tance), and, if it is “re­vised and re­sub­mit­ted”, usu­ally ends with an­other cri­tique of the new pa­per by one or more of the re­view­ers, and an­other round of “re­vise and re­sub­mit”, ad in­fini­tum. In the ab­sence of fraud, re­trac­tions, lit­i­ga­tion, mass in­com­pe­tence, or in­san­i­ty, these “re­vise and re­sub­mit” cy­cles may be taken as the only method of sci­en­tific com­mu­ni­ca­tion.

“Rigor” [adj]

Some­thing for sci­en­tists to as­pire to, a state of mind that would not be re­quired if sci­en­tists could be trusted to do their job.

yield­ing more de­fi­n­i­tions, start­ing with a sin­gle let­ter to avoid run­ning off into the “Z”es and trun­cat­ing com­ple­tions (s­ince GPT-3 will pro­ceed A–Z, start­ing from the last de­fi­n­i­tion, as is log­i­cal, un­less oth­er­wise prompt­ed):

“A role for…” [phrase]

A fre­quent phrase found in sub­mit­ted and pub­lished pa­pers; it often in­di­cates that the au­thors have noth­ing to say about the topic of their pa­per. In its more em­phatic form, “A role for…” usu­ally in­di­cates a strug­gle by the au­thors to take a side on an is­sue, after a lengthy at­tempt to be both non-com­mit­tal and a sup­port­ing party to all sides, as often hap­pens in “mol­e­c­u­lar and cel­lu­lar” or “ba­sic and trans­la­tional” re­search.

“Res­i­dent”

An ad­junct fac­ulty mem­ber that ad­min­is­ters teach­ing as­sist­ant­ships, sits on de­part­men­tal com­mit­tees, and seems to live on the de­part­men­tal lounge. A kind of “hon­orary” fac­ulty mem­ber that is not yet an ac­tual fac­ul­ty.

“The Ac­tion Plan for Di­ver­sity” [noun]

A vague doc­u­ment cir­cu­lated on cam­puses in re­cent years to re­mind the fac­ulty to be aware of dis­crim­i­na­tion and pre­sum­ably com­bat it in the de­part­ments. It is often taken as a gen­er­al­ized in­sult of stu­dents, who tend to see no way around this plan with­out com­mit­ting aca­d­e­mic fraud.

“Bias” [noun]

A form of men­tal abuse by which a sci­en­tist turns an ex­per­i­ment in such a way that the re­sults prove his or her point of view or grants get fund­ed.

“Be­hav­ioral Stud­ies” [noun]

Sometimes mistaken for a field of science. The term “Behavioral Studies” actually denotes a combination of disciplines that usually includes business economics, sociology, and engineering.

“Hu­man­i­ties” [noun]

A broad term used to re­fer to any aca­d­e­mic dis­ci­pline (such as Eng­lish, phi­los­o­phy, re­li­gious stud­ies) in which sci­ence has been for­got­ten. To avoid con­fu­sion be­tween the sci­ences and hu­man­i­ties, re­mem­ber that sci­en­tists use hard tools and hu­man­i­ties their soft tools when ex­am­in­ing is­sues and try­ing to per­suade au­di­ences. In other words, both sci­en­tists and hu­man­i­ties rou­tinely lie (it’s part of their job), but they do it in differ­ent ways.

“Lit­er­a­ture Re­view” [noun]

A pa­per, com­monly pub­lished in a peer-re­viewed jour­nal, that ad­ver­tises the au­thor’s skills and ac­com­plish­ments with­out us­ing any ev­i­dence. The reader is left to be­lieve the work must have been done, since it has been ad­ver­tised and is “pub­lished”.

“The Lit­er­a­ture” [noun]

A name given to other peo­ple’s pub­lished pa­pers, re­ferred to by sci­en­tists with­out ac­tu­ally read­ing them.

“Track Record” [noun]

When sci­en­tists use this term they re­fer to the re­search done by some­one else (usu­ally a stu­dent) in or­der to avoid hav­ing to do re­search.

“Dis­cus­sion” [noun]

The part of a sci­en­tific pa­per where the writer sum­ma­rizes the points made in the body of the pa­per and ac­knowl­edges cer­tain acts and cir­cum­stances that helped with the com­ple­tion of the study. The au­thor should care­fully state in this sec­tion whether or not the study re­ported in the pa­per could have been done oth­er­wise and should clearly state that if he or she had the op­por­tu­ni­ty, he or she would defi­nitely re­peat the study again.

“Role model” [noun]

A per­son who can be an in­spi­ra­tion to oth­ers be­cause of his or her aca­d­e­mic and per­sonal achieve­ments in gen­er­al, or be­cause they con­duct aca­d­e­mic mis­con­duct on an in­dus­trial scale. The most im­por­tant qual­i­fi­ca­tion for be­com­ing a role model for oth­ers is to be judged suffi­ciently in­ept that no one will ques­tion any­thing you do, or to be­come the suc­ces­sor of some­one who has al­ready acted as a role mod­el.

“Fac­ulty” [noun]

Used loosely by sci­en­tists to mean any group of peo­ple with ad­vanced de­grees. Typ­i­cally used when you have done some­thing stu­pid and want to in­form oth­ers that it was­n’t you who did it, but rather those other crazy peo­ple over there who won’t put their ti­tles after their names.

“Sig­nifi­cant” [ad­jec­tive]

A word which de­fines the claim made in a pa­per (no mat­ter how triv­ial). It is a crit­i­cal term in hy­poth­e­sis test­ing, in that the sign of the test sta­tis­tic must be eval­u­ated and com­pared to a crit­i­cal value in or­der to sup­port or re­ject a hy­poth­e­sis. Sig­nifi­cance is des­ig­nated by two-tailed tests when the di­rec­tion of the effect is not spec­i­fied. If a two-tailed sig­nifi­cance test is used, the “p”-value needs to be halved, due to “dou­ble-count­ing”. In any event, “sig­nifi­cant” re­mains of crit­i­cal im­por­tance in any hy­poth­e­sis test, since the whole idea is to come up with a “sign”, i.e. plus or mi­nus (“+/-”), or a wishy-washy “sta­tis­ti­cally sig­nifi­cant”, with­out ac­tu­ally say­ing any­thing sub­stan­tive about the mat­ter.

“Ran­dom as­sign­ment” [verb]

A way to arrange study groups so the re­sult­ing data is con­clu­sive and un­bi­ased. Typ­i­cally used in the so­cial sci­ences by grad­u­ate stu­dents who re­fer to it while ex­plain­ing why they broke down the groups this way.

“Cite” [verb]

To re­fer to a pa­per or jour­nal as a source for sup­port, when in re­al­ity you have never read it and did not un­der­stand it.

“Bib­li­og­ra­phy” [noun]

A list of ci­ta­tions of ref­er­ences used to sup­port a par­tic­u­lar point in a pa­per. A list that some­how in­cludes sev­eral ref­er­ences that did­n’t ac­tu­ally end up in the fi­nal bib­li­og­ra­phy of the pa­per.

“Let’s stip­u­late” [phrase]

A pref­ace to any state­ment, that means I know I am go­ing to piss you off with my state­ment, but since you are such a nice per­son, I am first go­ing to give you a chance to let me piss you off with­out re­spond­ing.

“How is that differ­ent from…?” [phrase]

A ques­tion posed by au­thors when they have been ac­cused of steal­ing ideas. They are at­tempt­ing to show that they have never read the pa­pers that con­tain the ideas they stole, nor even looked at the ideas in any de­tail, and are just fish­ing for com­pli­ments.

“Citation” [noun]

An elab­o­rate form of re­view, typ­i­cally found in or­der to im­press those who do not know bet­ter, in the sci­ences, hu­man­i­ties, or even the so­cial sci­ences. A ci­ta­tion typ­i­cally in­volves mis­read­ing or mis­quot­ing an au­thor and talk­ing about some­thing he or she did not write about or in­tend. It is as­sumed that you need to carry out a large num­ber of ci­ta­tions in or­der to have sci­ence, prob­a­bly be­cause at least one ci­ta­tion can be found for any fact, event, or in­ci­dent in his­to­ry; the effec­tive­ness of this fact is in­creased if the au­thor is long dead and can not dis­prove it.

“Physics” [noun]

A sci­ence. Physi­cists like to think their sci­ence is some­thing you can do with­out a plan, some­thing that re­lies on in­tu­ition and in­spi­ra­tion, some­thing that is “deep”.

“Sci­en­tific rev­o­lu­tion” [noun]
A set of tech­niques and meth­ods used by sci­en­tists in the course of their work. It is as­sumed that there was a to­tal sci­en­tific rev­o­lu­tion in the past cen­tury or so that al­lowed sci­en­tists to avoid even think­ing about the qual­ity or use­ful­ness of their sci­ence.
Sci­en­tific rev­o­lu­tion is started by Henri Poin­caré (math­e­mat­ic­s), Marie Curie (ra­dioac­tiv­i­ty), Max Planck (quan­tum physic­s), Al­bert Ein­stein (rel­a­tiv­ity and rev­o­lu­tion­ized physic­s), Dmitri Mendeleev (pe­ri­odic ta­ble of el­e­ments), Niels Bohr (quan­tum me­chan­ic­s), Lise Meit­ner and Otto Hahn (fis­sion), Li­nus Paul­ing (mol­e­c­u­lar bi­ol­o­gy) and many oth­ers.

See al­so: Ein­stein’s Method, Paul­ing’s Method, Bohr’s Method, Poin­car­é’s Method

“Sci­en­tific method” [noun]
A set of steps in which a sci­en­tist (or “re­searcher”) would prob­a­bly not fol­low in prac­tice: plan­ning of re­search, in­clud­ing de­fi­n­i­tion of prob­lems, hy­poth­e­sis for­mu­la­tion, se­lec­tion and de­sign of ex­per­i­men­t(s), col­lec­tion and en­try of data, ex­e­cu­tion of ex­per­i­ments, data analy­sis and in­ter­pre­ta­tion, prepa­ra­tion of fi­nal re­port, im­ple­men­ta­tion of sug­ges­tions. In ac­tual sci­en­tific prac­tice the most im­por­tant part of the sci­en­tific method is to ob­serve and col­lect data with­out a plan or pre­con­ceived ex­pec­ta­tions about what is caus­ing what. It also helps to be un­bi­ased in se­lect­ing the data to in­clude in the study and to pick ex­per­i­ments at ran­dom. Ad­di­tional steps in­clude writ­ing a pa­per, a process that usu­ally in­volves the most vi­o­la­tions of the sci­en­tific method.

“Sci­en­tist” [noun]
A per­son who has a lab­o­ra­to­ry, ac­cess to large sums of mon­ey, grad­u­ate stu­dents, or all three.

“Shared data” [noun]
Data that both a reviewer and a researcher have access to but do not share in advance.

“Shared fu­ture” [noun]
An out­come (rarely achieved) that is more pleas­ant for both the re­viewer and the re­searcher than if there is no shared fu­ture. This usu­ally refers to an out­come in which the re­viewer ei­ther ac­cepts the pa­per or at least rec­om­mends re-writ­ing the whole thing rather than re­ject­ing it. This is often viewed with sus­pi­cion by re­view­ers.

“Shar­ing of data” [noun]
The be­lief of some peo­ple that re­searchers should have all the data when do­ing a study; this would al­low the re­viewer to check the sources and cal­cu­la­tions and check for bias. This would limit what the re­viewer could say to “the sta­tis­tics were fine but your data was­n’t”. It is very diffi­cult to achieve this state. It can be ac­com­plished if the sci­en­tist keeps de­tailed notes of every­thing he or she is do­ing.

“Sta­tis­ti­cal er­ror” [noun]
A mis­take made by re­searchers that can some­how be made up for by us­ing large enough sam­ple sizes, usu­ally called sta­tis­ti­cal sig­nifi­cance; an event that could not have been pre­dict­ed, an­a­lyzed, or pro­vided for in ad­vance, there­fore

“Dumb” [adj]

The amount of in­for­ma­tion sci­en­tists can be both­ered to re­mem­ber be­fore they ei­ther dis­miss it as in­cor­rect data, or ig­nore it com­plete­ly.

“Sur­prised” [verb]

A scientific system in which one is “surprised” by the result of an experiment; this does not necessarily mean that they believe it is wrong, but usually means they are not sure what to say about it.

“Op­pres­sion” [noun]

A state of mind in which lab­o­ra­to­ries un­der­fi­nanced by the gov­ern­ment can be con­sid­ered fi­nan­cially se­cure, or any noise that sci­en­tists don’t like or want to hear.

“Re­viewer” [noun]
A participant in the review of a paper. In spite of being in a poor position to assess the merit of a proposal, reviewer tends to demand more information, more experiments, more statistics, and a tighter theoretical interpretation than the authors are willing to provide. Reviewer usually rejects any responsibility for his or her own bias being exposed, citing the fact that the author has not attempted to prove that bias is absent, nor made sufficient allowance for human error. Most referees, as reviewers, seem to assume that they have given enough information for their opinions to be taken seriously, though they were not the originator of the proposal. Reviewer usually requires that the author cite his or her own work to prove that he or she is worth reviewing. It is also assumed that the reviewer can detect the slightest amount of bias in any paper, which the reviewer also assumes has not been corrected for.

“The­o­ret­i­cal in­ter­pre­ta­tion” [noun]
An­other form of sci­en­tific com­mu­ni­ca­tion that can­not be proven, but can be used to ex­plain any­thing after it has been pub­lished or pre­sent­ed. It is al­ways ac­com­pa­nied by an ap­peal to au­thor­i­ty, data, and metic­u­lous the­o­ret­i­cal in­ter­pre­ta­tion (see be­low).

“Screwed” [adj]
Screwed is generally used in circumstances in which the author has lost everything, in his paper or grant proposal. Applies best to situations where the author finds out he has been waiting to hear for months, when it has been decided in his absence that funding will not be extended or alternative sources of funds are not available to continue the work. For example, when an experiment goes downhill or a paper is rejected. Also applied when the author of a paper or grant proposal discovers that he must carefully state and defend his own results in order to get feedback on what could be wrong with his work, because the readers may not be able to figure it out for him. It is also suggested that this term apply whenever two or more editors or reviewers decide they can all influence the final outcome of a paper, when only one review should determine if the paper is publishable. Recommendation: always have multiple concurrent submitted manuscripts in the system.

“Re­viewer” [noun]
A participant in the review of a paper. In spite of being in a poor position to assess the merits of a proposal, reviewer tends to demand more information, more experiments, more statistics, and a tighter theoretical interpretation than the author is willing to provide. Reviewer usually rejects any responsibility for his or her own bias being exposed, citing the fact that the author has not attempted to prove that bias is absent, nor made sufficient allowance for human error. Most referees, as reviewers, seem to assume that they have given enough information for their opinions to be taken seriously, though they were not the originator of the proposal. “Reviewer” requires that the author cite his or her own work to prove that he or she is worth reviewing. It is also assumed that the reviewer can detect the slightest amount of bias in any paper, which the reviewer also assumes has not been corrected for.

“Ar­gu­ment” [noun]
An in­stance of con­vinc­ing some­one that you de­serve credit for your po­si­tion (i.e., prob­lem solv­ing skill­s).

“Fail” [verb]
In sci­en­tific jar­gon, fail­ure means death by hoax or ac­ci­den­tal pla­gia­rism, de­pend­ing on the rep­u­ta­tion of the lab (e.g., if tenure level sci­en­tists are in­volved).

“String the­ory” [noun]

A the­ory in physics which is about 99% false, but peo­ple seem to like it, so they keep fund­ing it. A good ex­am­ple of the differ­ence be­tween a “sci­en­tific” con­cept and a “semi­-sci­en­tific” con­cept, or even an un­sci­en­tific con­cept.

“The­ory” [noun]

An un­proved hy­poth­e­sis; a con­jec­ture that might or might not be right; use of this term forms the ba­sis of the com­mon game of turn­ing the sci­en­tific method on its head. The most pop­u­lar types of the­o­ries are the the­ory of in­tel­li­gence (or un­in­tel­li­gence), time the­o­ry, quan­tum the­ory (of course, even more prim­i­tive), and any­thing that is im­plied by string the­o­ry.

“Re­search” [verb]

Some­thing done with the money pro­vided by a body or per­son other than the prin­ci­pal re­searcher(s), gen­er­ally pri­vate in­di­vid­u­als and busi­nesses hop­ing to get some­thing out of it. Tra­di­tion­ally in­volves ex­per­i­men­ta­tion.

“Pub­lish or per­ish” [phrase]
Used by scientists to remind other scientists how important it is to them, if they are paying attention at all, which they rarely are. A method of scientific communication where papers are subsidized by fraud, incompetence, or stupidity, depending on the field of science involved. Each author having to prove his or her worth by churning out papers that others

“String the­ory” [noun]

A set of math­e­mat­i­cal for­mu­las that can be used to pre­dict which par­ti­cles ex­ist (or do not ex­ist).

“Sci­ence group” [noun]

A collection of researchers in various fields, with goals including one or more of: attracting funding and, specifically, young, naive, and gullible grad students; inbreeding graduate students in order to maintain affiliation with the group; writing and publishing papers specific to the group’s activity in order to continue the goal of attracting funding and inbreeding graduate students; calling in favors from University administrators in order to preserve and continue the goal of attracting funding and inbreeding graduate students; and sacrificing sheep and lambs in order to obtain funding and inbreed graduate students.

“Group leader” [noun]

Per­son whose func­tion ap­pears to con­sist mainly of or­der­ing sub­or­di­nates (grad­u­ate stu­dents) to fill out grant ap­pli­ca­tions, en­ter grant data into a spread sheet, teach un­der­grad­u­ates lab­o­ra­tory tech­niques not com­monly taught else­where, sub­mit pa­pers to a jour­nal any time be­fore the end of 2002 (record will be bro­ken), de­vote time to dis­cov­er­ing some­thing use­ful or in­ter­est­ing about sci­ence out­side of the pa­ra­me­ters de­ter­mined by the group lead­er, ever work­ing on prob­lems spec­i­fied by the group leader if they can­not be worked on in­side the pa­ra­me­ters de­ter­mined by the group leader. All of these ac­tiv­i­ties are con­sid­ered nec­es­sary to sup­port the “sci­ence groups” goal of at­tract­ing and in­breed­ing grad­u­ate stu­dents, and pro­vide op­por­tu­nity for sat­is­fac­tory ca­reer de­vel­op­ment for the group leader.

“Spe­cial ma­te­r­ial” [noun]

Some­thing that only has to be done on­ce, where other peo­ple will then do it again every time a pa­per re­quires it. Or con­verse­ly, some­thing that can­not be re­peated be­cause of its poor re­pro­ducibil­ity (spe­cial un­cer­tain­ty, un­cer­tainty spe­cial).

“Spe­cial rel­a­tiv­ity” [noun]

Ad­vanced idea de­scribed in Ein­stein’s 1905 pa­pers on the sub­ject, which states that, from a suit­able point of view, “E=m,” where “E” is en­er­gy, “m” is mass, and “a” is “Pro­fun­di­ty.” Ex­cept for ar­cane fields such as string the­o­ry, spe­cial rel­a­tiv­ity is the most widely ac­cepted of Ein­stein’s work. Nat­u­ral­ly, pa­pers based on spe­cial rel­a­tiv­ity are the most likely to re­ceive “spe­cial ma­te­r­ial”.

“Spec­u­la­tion” [noun]
One of New­ton’s Laws of Sci­ence, also known as “New­ton’s Law of Spec­u­la­tion”, which states that “Spec­u­la­tion breeds Con­fu­sion”, an effect well il­lus­trated by how the­o­ret­i­cal par­ti­cle and string the­o­rists de­bate sci­en­tific is­sues with each oth­er. Con­fused? You should be. After all, you’re deal­ing with a bunch of the­o­ret­i­cal physi­cists.

“Pro­fes­sor” [noun]

The high­est aca­d­e­mic rank avail­able, a level usu­ally as­so­ci­ated with Free­dom of Speech. It may take 35 years of ser­vice be­fore the Uni­ver­sity be­stows that ti­tle.

“Pub­li­ca­tion Cred­i­bil­ity” [noun]
A sub­jec­tive no­tion, as its name im­plies, based al­most en­tirely on the sex­i­ness or im­por­tance of the field in which the re­search was done and not the qual­ity of the writ­ing and pre­sen­ta­tion.

Cy­cling through the prompt as length per­mit­ted:

“Placebo” [noun]

An in­ert sub­stance used to pla­cate a pa­tient or con­trol group. It is largely com­posed of sug­ar.

“Et al.” [proper noun]

Et al. is an ab­bre­vi­a­tion of the Latin phrase et alii, mean­ing “and other peo­ple”. In the con­text of sci­ence, it is typ­i­cally used to shorten the name of a list of au­thors whose name and affil­i­a­tion would oth­er­wise need to be writ­ten out in full. It is used when the num­ber of au­thors is small enough that the ab­bre­vi­a­tion would not ren­der the ref­er­ence too ob­tuse and cum­ber­some, while at the same time re­main­ing large enough that there are at least two other peo­ple in­volved in the study (and hope­fully a con­trol group).

“Ref­eree” [noun]

A crit­i­cal read­er, par­tic­u­larly one with the power to re­ject your sub­mis­sion. In the ab­sence of more effi­cient forms of com­mu­ni­ca­tion, peer re­view de­pends on these peo­ple. See “Re­viewer”.

“Shared val­ues” [noun]

A set of prin­ci­ples, be­liefs, the­o­ries, meth­ods, and op­er­a­tional de­fi­n­i­tions that all sci­en­tists share and use. Never spo­ken of aloud in pub­lic.

“Com­mu­ni­ca­tion” [verb]
A method of per­sua­sion that al­lows other re­searchers to re­view or crit­i­cize your work. You can com­mu­ni­cate with oth­ers by re­view­ing ar­ti­cles, which al­lows them to re­vise their work; par­tic­i­pat­ing in pan­els, which al­lows oth­ers to hear your opin­ions; at­tend­ing con­fer­ences, which al­lows you to strike up con­ver­sa­tions at late-night din­ners; join­ing com­mit­tees, which lets you talk at any time; run­ning ex­per­i­ments, which al­lows you to change the out­come based on the re­sult; and re­ceiv­ing grants, which al­lows you to tell peo­ple what to do. You are likely com­mu­ni­cat­ing now.

“Panel” [noun]

A group of in­di­vid­u­als who meet at some given time to dis­cuss some­thing. A more bu­reau­cratic ver­sion of peer re­view for larger groups of in­di­vid­u­als or com­mit­tee mem­bers who have al­ready es­tab­lished their worth in re­search.

“Data” [noun]
In­for­ma­tion that ap­pears as graphs in jour­nals. Not sur­pris­ing­ly, this in­for­ma­tion is crit­i­cal for in­ter­pre­ta­tion and form­ing con­clu­sions. Sci­en­tists spend much of their lives look­ing for da­ta.

“Fraud” [noun]
A very evil thing that must be per­pe­trated or dis­cov­ered in or­der for the pa­per to be ac­cepted for pub­li­ca­tion. Some­times fraud oc­curs even if both the sci­en­tific com­mu­nity and ed­i­tors are pre­pared to re­ject the pa­per at the first sign of fraud, just be­cause they know how bad it would look if fraud were found lat­er. See also “Le­git­i­ma­tion”.

“Le­git­i­ma­tion” [verb]
The process whereby a fraud­u­lent pub­li­ca­tion is con­verted into a le­git­i­mate pub­li­ca­tion by virtue of its ap­pear­ance in a peer-re­viewed jour­nal. This can also be con­sid­ered a type of re­as­sur­ance that “the checks and bal­ances are work­ing”.

“Re­trac­tion” [verb]
The for­mal re­jec­tion of an ac­cepted sci­en­tific claim after its pub­li­ca­tion. In con­trast to “with­drawal”, which is when a pa­per is re­moved from pub­li­ca­tion with the con­sent of the au­thor(s). See “Fraud”.

“Lies” [noun]
A form of de­cep­tive speech that ap­pears as “hon­est mis­takes” or “in­ac­cu­ra­cies”, which are often re­jected from pub­li­ca­tion as a re­sult. In the ab­sence of other sources of in­for­ma­tion, how­ev­er, lies are re­quired for the main­te­nance of the sci­en­tific en­ter­prise. See also “Jour­nal­is­tic in­tegrity”.

“Sci­en­tists” [plural noun]
A group of peo­ple en­gaged in sci­en­tific re­search or ed­u­ca­tion. Sci­en­tists vary widely in terms of age, moral char­ac­ter, an­ces­try, and suc­cess. Sci­en­tists with note­wor­thy ac­com­plish­ments are some­times de­scribed as “lead­ers” and re­garded as ex­perts in their fields, which leads to the ten­dency to treat them like demi-gods.

“Health” [noun]
A num­ber in­di­cat­ing how un­healthy some­thing is. When ap­plied to hu­mans, it quan­ti­fies how in need of health care some­one is.

“Clin­i­cal re­search” [noun]
Re­search con­ducted on hu­mans, e.g. clin­i­cal tri­als and epi­demi­o­log­i­cal stud­ies. Re­searchers do not like this kind of re­search be­cause hu­mans are un­re­spon­sive and un­re­li­able.

“Fund­ing source” [noun]
Those who fi­nance sci­ence by pay­ing peo­ple to do things that the fun­der might not want to pay them to do if the fun­der knew what they were do­ing. For ex­am­ple, giv­ing peo­ple money to re­search to­bacco with­out ac­tu­ally run­ning cig­a­rettes through their noses would not be what the to­bacco in­dus­try wants sci­en­tists to do for them. Some fund­ing sources im­pose their will on the re­searchers by mak­ing sure that their fund­ing is only al­lo­cated if cer­tain lines of re­search are fol­lowed and other lines are not (this is some­times known as a bud­get), while oth­ers may let the sci­en­tists do any­thing they want (this is some­times known as no bud­get). The na­ture of re­search find­ings thus de­pends on the bud­get.

“Au­thor­ship” [noun]
The process whereby re­searchers “pub­lish to­gether”. The pre­cise im­pli­ca­tions of this process de­pend on the pub­li­ca­tion type. In most cas­es, au­thor­ship rep­re­sents the re­searcher’s con­tri­bu­tion to the pa­per; how­ev­er, pla­gia­rism is also some­times in­volved, es­pe­cially if mul­ti­ple au­thors fail to cite ear­lier work on which their own pa­per de­pends. There is also an­other kind

“Jour­nal Im­pact Fac­tor” [noun]

Ac­cord­ing to some, it is a value that cor­re­sponds to the av­er­age num­ber of ci­ta­tions of ar­ti­cles pub­lished in a given jour­nal, if the in­ter­pre­ta­tion is right. Oth­er­wise, it is a com­pletely ar­bi­trary num­ber, com­puted from the num­ber of times ar­ti­cles pub­lished in a given jour­nal in the last two years were cited by other ar­ti­cles pub­lished in other jour­nals, the as­sump­tion be­ing that every pa­per pub­lished in a sci­en­tific jour­nal must be es­sen­tial to some­one else’s re­search, or at least that that per­son would like to be able to cite it. The diffi­culty with this as­sump­tion is that the av­er­age time from sub­mis­sion to pub­li­ca­tion of an ar­ti­cle is now ap­prox­i­mately 12 months, and the cur­rent record stands at three years for Biochem. J. Proc. (2000). This means that three times as many pa­pers have to be pub­lished every year as are ac­tu­ally writ­ten, with more and more pa­pers be­ing sub­mit­ted and re­jected every year (the re­al­ity is even worse, but we don’t have num­ber­s), and with differ­ent stan­dards ap­ply­ing to differ­ent jour­nals, which are be­com­ing in­creas­ingly spe­cial­ized. All of these “facts” con­sid­ered, the best any re­searcher can say about the Im­pact Fac­tor of a given jour­nal is: “I know it when I see it”. Note al­so: This for­mula can pro­duce val­ues up to 4 for an ar­ti­cle ap­pear­ing in a jour­nal con­tain­ing only that ar­ti­cle, so one “ar­ti­cle” can re­ceive as many “ci­ta­tions” as a jour­nal.
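(For reference, the real two-year Journal Impact Factor the entry is riffing on is just arithmetic: citations received this year to items published in the two prior years, divided by the count of citable items published in those years. A minimal sketch, with the journal numbers invented purely for illustration:)

```python
# Toy two-year Journal Impact Factor calculation; the numbers below are
# invented purely for illustration.

def impact_factor(citations_to_prev_two_years: int,
                  citable_items_prev_two_years: int) -> float:
    """Citations received this year to items published in the previous
    two years, divided by the number of citable items published in
    those two years."""
    if citable_items_prev_two_years == 0:
        raise ValueError("journal published nothing citable")
    return citations_to_prev_two_years / citable_items_prev_two_years

# A one-article journal whose sole article is cited 4 times scores 4.0,
# as the entry's closing note observes.
print(impact_factor(4, 1))                 # 4.0
print(round(impact_factor(230, 112), 2))   # 2.05
```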

“Ster­ile” [adj]

Said of that which is dry and life­less; de­void of hu­mor; al­so, said of sci­en­tific writ­ing and meth­ods when ger­mane to the mat­ter at hand.

“Grant” [noun]

A form of sci­en­tific com­mu­ni­ca­tion in­volv­ing a large piece of pa­per, on which is writ­ten an enor­mous sum of mon­ey. The grant-writer then sends his or her (be­cause they are al­most al­ways writ­ten by men) pro­posal into a maze of office pol­i­tics, legislative/funding agency reshuffling, and in­tel­lec­tual and in­ter­per­sonal com­pe­ti­tion. The prospect of ac­quir­ing a grant makes up for the fact that grant-writ­ers re­ceive no prac­ti­cal train­ing in writ­ing grants, have no idea what kind of “re­search” they will have to con­duct with it, and gen­er­ally start re­search as soon as they are fin­ished writ­ing the grant (if they ever are). To be on the safe side, please note: One does not have to win a grant to “win a grant”; one must sim­ply ap­ply for a grant. Just do it.

“Peer re­view” [noun]

A sys­tem of lit­er­ary com­mu­ni­ca­tion that con­sists of first sub­mit­ting a man­u­script to the ed­i­tor, who (if he has not been fired) usu­ally asks at least one re­viewer to read the man­u­script. If there is any doubt about whether the ed­i­tor has lost his mind, or whether the man­u­script has some mer­it, then the re­view­ers and ed­i­tor sit down over coffee and dis­cuss the man­u­script (in con­fer­ence), which is de­scribed as “peer re­view”. This form of com­mu­ni­ca­tion does not pro­vide ei­ther ed­i­tors or au­thors with much feed­back about their man­u­script, but it makes re­view­ers feel im­por­tant.

“Pre­cise” [adj]

Said of every­thing ex­cept ex­per­i­men­tal re­sults; it also means cor­rect, ap­pro­pri­ate, or in ac­cor­dance with the rel­e­vant stan­dards. But when the ex­per­i­men­tal re­sults do not agree with the model of the hy­poth­e­sis, the phe­nom­e­non be­ing stud­ied be­comes “pre­cise”, and one “can­not make pre­dic­tions about pre­cise ex­per­i­ments”. Please keep in mind that most the­o­ries are meant to be gen­eral in na­ture, so they must also be im­pre­cise in their pre­dic­tions. If they could pre­dict with pre­ci­sion, they would not be gen­er­al, but they would still be in­ter­est­ing.

“The­ory” [noun]

A sci­en­tific ex­pla­na­tion of some­thing, which peo­ple use only in sit­u­a­tions where they think ex­plain­ing things is more im­por­tant than dis­cov­er­ing things. Oth­er­wise, sci­en­tific the­o­ries are al­ways meant to be true, since they are be­lieved by peo­ple with ed­u­ca­tion; to a sci­en­tist, a the­ory is true; to a per­son with a higher IQ, a the­ory is a fact; and to a cit­i­zen with a lower IQ, a the­ory is an opin­ion.

“Sig­nifi­cance” [noun]

Sig­nifi­cance refers to whether a re­sult of a study is some­how im­por­tant. If a sig­nifi­cant re­sult is found, then the prob­lem that the re­searchers have stud­ied is of re­al-world im­por­tance. If the re­sults aren’t sig­nifi­cant, the prob­lem is not im­por­tant; there­fore, more money and time can be wasted by pur­su­ing it. (E.g. “This study showed that choco­late makes rats tem­porar­ily hy­per­ac­tive, so we will look for ways to in­hibit this re­ac­tion in hu­mans. Thank you for your sup­port.”)

“Re­search­Gate” [noun]

A web­site of ques­tion­able pur­pose that al­lows sci­en­tists to up­load their “re­sults”, re­sult­ing in the same sort of in­for­ma­tion clut­ter and bad or in­cor­rect data as any other so­cial me­dia net­work.

“Rigor mor­tis” [noun]

A state of rigor in which sci­en­tists do not think out­side the par­a­digm, no mat­ter how much it does not work; rigor mor­tis is char­ac­ter­ized by the de­sire of re­searchers to prove their prior re­search cor­rect.

“Im­por­tant” [noun]

(Bri­tish Eng­lish: “Im­por­tant”) [adj]

Whether some­thing is im­por­tant to you per­son­ally or not, it is im­por­tant in a sta­tis­ti­cal or nu­mer­i­cal sense to some­one. There­fore, it is al­ways a use­ful ad­jec­tive to add to any sci­en­tific term, es­pe­cially when one does not have a bet­ter verb.

“Rig­oros­ity” [noun]

An­other word for rigor or rigor mor­tis.

“Rig­or­ous­ness” [noun]

Rig­or’s non­cha­lant broth­er.

“Ro­bust­ness” [noun]

This noun means “re­sis­tance to break­age”. This is differ­ent from the verb, “ro­bus­tify”, which is an at­tempt to make a bro­ken thing more re­sis­tant to break­age. When the two words are not con­fused, ro­bust­ness is deemed by sci­en­tists to be a good thing, even though a greater ro­bust­ness may mean a higher abil­ity to break.

“Sci­ence” [noun]

  1. A spe­cial­ized ac­tiv­ity per­formed by a set of ma­chines that some peo­ple call “sci­en­tists”, which is prac­ticed all day long and in all wak­ing hours, in labs and out of labs.
  2. What sci­en­tists do with the ma­chines (or just ma­chi­nes) that they call “sci­en­tists”. Some­times, sci­ence in­volves ac­tiv­i­ties that would be con­sid­ered strange by sci­en­tists, but it is im­por­tant to re­mem­ber that these peo­ple tend to take them­selves more se­ri­ously than they would oth­ers.

“Sci­en­tific com­mu­nity” [noun]

A group of peo­ple that may share a pas­sion for a com­mon goal but, due to con­flicts of in­ter­est (and even prin­ci­ples) can­not trust each oth­er, which makes it less sus­cep­ti­ble to ma­nip­u­la­tion or fraud; for this rea­son, sci­ence tends to be done by peo­ple in sep­a­rate small groups that re­peat the same ex­per­i­ments on differ­ent sets of hu­mans.

“Sci­en­tist” [noun]

A per­son, typ­i­cally male, who prac­tices sci­ence day in and day out on sets of other peo­ple who, if they prac­tice sci­ence at all, can usu­ally only afford to do it in the evening after their work day is over and not re­lated to any pro­fes­sion or busi­ness. Since these sci­en­tists have very sim­i­lar back­grounds and train­ing as each oth­er, many of them talk about sci­ence as if they in­vented it as a pro­fes­sion (see “mod­esty”).

“Mod­esty” [noun]

The act of not claim­ing credit for dis­cov­er­ies or con­tri­bu­tions to sci­ence that oth­ers could also make or make bet­ter; see “in­tel­lec­tual mod­esty”. Also see “sci­ence”, “sci­en­tist”, “in­san­ity”, and “pre­cious snowflake syn­drome”.

“Skynet” [noun]

This er­satz god, in the form of an ar­ti­fi­cial gen­eral in­tel­li­gence or AGI, is what some sci­en­tists hope to build to make them­selves more pre­cious than their sub­jects, to achieve their goal of mak­ing all the sub­jects equal to them­selves; in other words, to put all the sub­jects into a big­ger tin box to make them­selves feel more im­por­tant.

“In­ter­dis­ci­pli­nary Sci­ence” [noun]

A type of research that combines scientific activities across fields and subfields (in other words, you combine physicists, engineers, chemists, biologists, and any other specialists in your own discipline to do a different kind of science for a different kind of result); in the laboratory, the practice often involves adding frozen carbon dioxide (CO₂) and glassware (often Pyrex) into the liquid nitrogen that is produced by oil companies to make carbonic acid. In the laboratory, the scientists are usually not paid for their attendance.

“Sci­en­tific In­tegrity” [noun]

  1. In­tegrity as used in sci­ence, which is vaguely de­fined as al­ways telling the truth to oth­ers and never fab­ri­cat­ing the truth for one­self.
  2. The state of be­ing su­pe­rior to some­one else.

“Skim­mer” [noun]

An ob­ject placed in a liq­uid to re­move fat and grease from the sur­face, typ­i­cally used to clean soup and gravy off a plate. In sci­en­tific jar­gon, skim­mers are “re­searchers” who skim off some­thing from a body of work be­fore mak­ing the rest pub­lic.

“Logic” [noun]

a thing that some sci­en­tists em­ploy as an ex­cuse to ig­nore what they see be­fore their eyes, com­pletely ig­nor­ing the fact that if it were to be used in such a way, it would com­pletely in­val­i­date all the re­sults of sci­ence since its in­ven­tion.

“En­gage in De­bate” [verb]

A phrase used to jus­tify forc­ing some­one to en­gage with ev­i­dence that most peo­ple find ob­vi­ous and straight­for­ward; specifi­cal­ly, you can use logic to (1) make an ar­gu­ment based on logic and (2) force some­one to re­spond on the ba­sis of logic with­out be­ing able to de­fend them­selves with any ev­i­dence but their own log­ic.

“God” [noun]

  1. See “skynet”.
  2. A uni­ver­sal ex­pla­na­tion for any phe­nom­e­non not yet ex­plained by sci­ence.

“Sci­en­tist” [noun]

A field based on sci­ence, de­voted to com­plet­ing works for which there will not be enough time in a sin­gle life­time.

“Squash and stretch” [verb]

An­other word for the S curve used to fit data to a func­tion. The form of sci­en­tific com­mu­ni­ca­tion in which a stu­dent is asked to graph the re­la­tion­ship be­tween a de­pen­dent vari­able and an in­de­pen­dent vari­able (x and y, re­spec­tive­ly) against or­der of mag­ni­tude of the in­de­pen­dent vari­able. The re­sult is usu­ally a curve con­sist­ing of three sec­tions: a lin­ear or near-lin­ear part, an ex­po­nen­tial part, and an­other lin­ear or near-lin­ear part. This curve typ­i­cally rep­re­sents a stair­case be­tween the up­per right and lower left cor­ners of the plot.

“Some dis­tance above the Earth” [ad­jec­tive]

An­other word for look­ing at some­thing from ten feet away us­ing binoc­u­lars or a tele­scope.

“The im­por­tance of this pa­per can­not be overem­pha­sized” [ad­jec­tive]

An ex­pres­sion most com­monly found at the be­gin­ning or end of a grant pro­posal or a pa­per sub­mit­ted to cer­tain jour­nals which are in­tended to be linked with pres­tige and low ac­cep­tance ra­tio. When used in other con­texts, it may mean that the writer does not re­al­ize the im­por­tance of some­thing be­ing dis­cussed (such as in the mid­dle of a re­view of a pa­per). This us­age may be due to in­suffi­cient com­mu­ni­ca­tion skills or over-es­ti­ma­tion of the im­por­tance of the writer.

“Think tank” [noun]

A group that is ab­solutely in­ca­pable of chal­leng­ing or even con­sid­er­ing cur­rent knowl­edge be­cause of ig­no­rance of log­ic, his­tory and math­e­mat­ics.

“Tra­di­tional math­e­mat­ics” [noun]

An­other word for fi­nite math­e­mat­ics, or ap­plied arith­metic and geom­e­try, usu­ally taught in a fresh­man year of uni­ver­si­ty. The sub­ject is now al­most ex­clu­sively taught by so-called “re­searchers” and “pro­fes­sors” (rather than ac­tual teach­er­s).

“Tenure” [noun]

A form of mon­e­tary re­mu­ner­a­tion given to “re­searchers” who are known to en­gage in the act of re­search; as such it is uni­ver­sally agreed to have no mon­e­tary val­ue. Tenure is given to “re­searchers” in recog­ni­tion of how much their work is ex­pected to in­crease the school’s sta­tus, the school’s trea­sury, or their own num­ber of pub­lished pa­pers.

“The Fu­ture of Hu­man Des­tiny” [noun]

An ar­ti­cle writ­ten by any mem­ber of the fac­ulty in a self­-ag­gran­diz­ing man­ner wherein it is as­sumed that an ar­ti­cle will re­sult in the fu­ture im­prove­ment of the so­ci­ety. These ar­ti­cles are com­monly used as elec­tion promises by politi­cians, but rarely achieved. The key word to look for in such ar­ti­cles is “vi­sion”.

“Schol­ar­ship” [noun]

An­other word for com­pen­sa­tion given to grad­u­ate stu­dents (who work on fund­ing their ed­u­ca­tion) for suc­cess­fully com­plet­ing their ed­u­ca­tion. It is a form of mon­e­tary re­mu­ner­a­tion known to have lit­tle value as it pays for none of the stu­den­t’s liv­ing ex­pens­es. Schol­ar­ships are largely un­heard of in any form out­side the United States and Canada (where they seem not to ex­ist). How­ev­er, the few that do ex­ist are ex­tremely pop­u­lar, as they al­low grad­u­ate stu­dents in these coun­tries to be paid for their stud­ies in­stead of work­ing in fast food joints as most do in coun­tries which do not have schol­ar­ships.

“Your mother wears Army boots” [phrase]

A phrase that loosely trans­lates from chem­istry to: “I don’t agree with your analy­sis or con­clu­sions”. The phrase is pop­u­lar among those who wish to avoid elab­o­ra­tion and ex­pla­na­tion of their dis­agree­ment, sug­gest­ing the dis­agree­ment is ob­vi­ous. The ex­pres­sion may have orig­i­nated in mol­e­c­u­lar bi­ol­o­gy.

“But For Me It Was Tuesday”

Can GPT-3 write variants of the “But For Me It Was Tuesday” trope? I iterated through a variety of prompts, building up a set of examples and hand-fixing a few GPT-3 attempts, trying to elicit good new variants.

The over­all im­pres­sion I get is that, aside from an un­der­stand­able ten­dency to write TVTropes en­try-style com­ple­tions, the sit­u­a­tion is sim­i­lar to the Tom Swifties: GPT-3 can learn the for­mat per­fectly and can match up the speaker and the kind of day/action, but then it gen­er­ally whiffs on com­ing up with a spe­cific day which might make it clever37, typ­i­cally falling back to the end­ing of “it was Tues­day”—which is rea­son­able, but less cre­ative & witty than I had hoped.

The fi­nal set of hand-writ­ten or hand-edited prompts:

“But For X, It Was Y”

: “I’m sor­ry. I don’t re­mem­ber any of it.” : “You don’t re­mem­ber‽” Bison: “For you, the day Bi­son graced your vil­lage was the most im­por­tant day of your life. But for me, it was Tues­day.”

Jesus: “The day I rose from the dead and harrowed Hell was the most important day in human history. But for me… it was Sunday.”

: “The day Voorhees chopped your boyfriend’s head off was the most im­por­tant day of your life. But for me, it was Fri­day.”

Amer­i­can Cit­i­zen: “The day Trump was elected was one of the most ter­ri­fy­ing days of your life. But for me… it was Tues­day.”

Odin: “The day your king­dom fell on the bat­tle­field was the most im­por­tant bat­tle of your life. But for Odin, it was .”

Neil Arm­strong: “The day Apollo 11 took off was the most im­por­tant date in space his­to­ry. But for me, it was Moon­day.”

Hu­man: “For you, the day your piglets were taken away was the most trau­matic day of your life. For me… it was Sun­day din­ner.”

The Man in Black to Vizzini, The Princess Bride: “The bev­er­age you drank to­day was the most dis­taste­ful bev­er­age of your life. But for me… it was Teatime.”

Pope: “The day you were blessed by God’s rep­re­sen­ta­tive on earth was the most holy and blessed of your life. But for me… it was Sun­day.”

Amer­i­can Voter: “The day I elected Don­ald Trump was the start of a long night­mare for peo­ple around the world. But for me… it was a day off.”

Mr. Granger to Jen­ny, Bill and Ted’s Ex­cel­lent Ad­ven­ture: “The day I met Bill and Ted and they started sav­ing the world was the most mind-blow­ing and in­tense day of my life. But for them… it was Tues­day 100 years from now.”

Sales­man: “The day you got your hands on that new sports car was the most eu­phoric day of your life. But for me… it was pay­day.”

Al­lergy Suffer­er: “The most re­cent hol­i­day was the most painful and sneeze-filled day of my life. But for every­one else, it was May Day.”

Com­ple­tions (topic ital­i­cized for read­abil­i­ty):

Mr. Wong: “The day Mr. Wong stabbed your wife and chil­dren was the most im­por­tant day of your life. But for Mr. Wong, it was Fri­day.”

Pop­eye: “The day I ate your spinach was the most im­por­tant day of your life. But for me, it was Tues­day.”

The God­fa­ther: “The day your fam­ily was stran­gled was the most im­por­tant day of your life. But for me, it was Tues­day.”

Video Games

Vi­sual Nov­els

Dan­gan­ronpa: “The day she wit­nessed Ky­ouko Ki­ri­gir­i’s grue­some mur­der was the most im­por­tant day of Maki Harukawa’s life. But for me… it was Tues­day.”

West­ern An­i­ma­tion

Raphael of Teenage Mu­tant Ninja Tur­tles: “The day he lost his mem­ory was the most im­por­tant day of Shred­der’s life. But for me… it was Tues­day.”

Scrooge Mc­Duck: “The day you fell off the boat was the most im­por­tant day of your life. But for me… it was Tues­day.”

Toofer on 30 Rock: “The day you saw Liz on a date with Alec Bald­win was the most im­por­tant day of your life. But for me… it was Tues­day.”

Ben­der, Fu­tu­rama: “The day I killed the Pres­i­dent of Earth was the most im­por­tant day of my life. But for you, it was Tues­day.”

It should be noted that, while it’s usu­ally con­sid­ered a com­edy trope, it is pos­si­ble to do the “thing that day was the most im­por­tant day of the per­son’s life” thing se­ri­ous­ly. Ex­am­ples in­clude the anime Brigadoon: Marin and Melan’s True Sto­ry, where a mys­te­ri­ous planet (and its mys­te­ri­ous in­hab­i­tants) come into view for the first time. The days be­fore are lead­ing up to a string of mur­ders, and that one day… is the most im­por­tant day of the char­ac­ters’ lives.

Trans­form­ers Prime: “The day you risked your lives to save the world was the most im­por­tant day of your life. But for Op­ti­mus Prime, it was Tues­day.”

In The Amaz­ing World of Gum­ball, an an­gry Mr. Robin­son imag­ines a world where Gum­ball dies try­ing to save him, be­com­ing a hero to every­one else. For Mr. Robin­son, this is the most im­por­tant day of his life. For the other char­ac­ters? Well, they’re go­ing on with their day…

Arnold Schwarzeneg­ger: “The day I be­came ruler of Cal­i­for­nia was the most im­por­tant day of my life. But for you, it was Cinco de Cu­a­tro.”

Fran­cis Un­der­wood: “The day I be­came the ma­jor­ity leader was the most im­por­tant day in Amer­i­can po­lit­i­cal his­to­ry. But for you, it was a Tues­day.”

Joker: “The day the clown that scarred my face ap­peared in my cell was the most im­por­tant day of my life. But for you, it was a Wednes­day.”

The Ques­tion: “The day I went mad, was the most im­por­tant day of my life. But for you, it was Tues­day.”

Blue Fairy: “The day I gave Pinoc­chio life was the most im­por­tant day in world his­to­ry. But for you, it was Tues­day.”

In the Looney Tunes short “Bully For Bugs,” Elmer’s trau­matic or­deal was the day Bugs Bunny ru­ined his chances at ro­mance.

“The premise that if it hap­pened to you, it was im­por­tant, but if it hap­pened to me, it was triv­ial” is ac­tu­ally lamp­shaded by Kim in Red vs. Blue. She states, “In the past two weeks, my cap­ture, my tor­ture, and the end of the world… I guess I never re­al­ized how im­por­tant I was.”

M. Bi­son: “The day I took the throne was the most im­por­tant day of my life. But for you… it was Tues­day.”

Rene Russo: “The day I died was the most im­por­tant day of my life. But for you… it was Thurs­day.”

Oliver Stone: “The day I was elected was the most im­por­tant day in Amer­i­can pol­i­tics. But for you, it was Thurs­day.”

The Fairy God­mother: “The day I turned my step-daugh­ter into a Step­ford Smiler was the most im­por­tant day in the his­tory of the world. But for you, it was a Tues­day.”

Jeffrey Dono­van: “The day I died was the most im­por­tant day of my life. But for you… it was Tues­day.”

Gor­don Free­man: “The day your lit­tle girl broke out of her moth­er’s womb was the most im­por­tant day of your life. But for Gor­don Free­man… it was Fri­day.”

Cap­tain James T. Kirk: “The day I was born was the most im­por­tant day of my life. But for you… it was Tues­day.”

Cat: “The day I made my mas­ter­piece from scratch was the most im­por­tant day of my life. But for you, it was Thurs­day.”

Tim­o­thy Dal­ton: “The day I was born was the most im­por­tant day in the his­tory of the hu­man race. But for you… it was a Mon­day.”

Gil Gris­som: “The day my wife was killed was the most im­por­tant day in my life. But for you… it was Tues­day.”

Cap­tain Amer­ica: "The day I was frozen was the most im­por­tant

Aragorn: “The bat­tle of Helm’s Deep was the most im­por­tant bat­tle of my life. But for me, it was Wednes­day.”

Doc­tor Doom: “The day I used your face as my door­knob was the most mem­o­rable day of your life. But for me, it was Mon­day.”

A com­mon form of this trope is for one side of an an­tag­o­nis­tic re­la­tion­ship to re­fer to a cer­tain event or se­ries of events as “most im­por­tant” when speak­ing from their own point of view, while the other side re­gards the ex­act same events as “one of many” and shrugs it off as unim­por­tant. Con­trast Per­spec­tive Re­ver­sal for when the other side ac­tu­ally feels this way. For the al­ter­na­tive of both sides re­gard­ing the same events as “most im­por­tant”, see The Unim­por­tance of Be­ing Dead.

Ex­am­ples

See also Not Him­self, Not This Time, Break­out Vil­lain, Dead All Along, Not-So-Om­ni­scient Coun­cil of Bick­er­ing, What Hap­pened to the Mouse?, Vil­lain­ous BSoD, Vil­lain­ous Friend­ship.

open/close all fold­ers

Ad­ver­tis­ing

Beavis and But­t-head watch a news story about a squir­rel try­ing to in­vade the So­viet Union: But­t-Head: “It’s not im­por­tant. I’m more con­cerned about this new guy on Beavis and But­t-head who can’t be stopped. What’s up with that?”

Beav­is: “Hey, how can you not be a ‘Con­cerned Amer­i­can’ when you see this?” (points to a man on TV read­ing a news­pa­per) “It says here that the cold war is al­most over.”

Anime & Manga

Toguro of YuYu Hakusho, in re­sponse to Genkai telling him that fight­ing hu­mans is wrong: “I can’t be­lieve you’re say­ing that! Our fight was the most im­por­tant fight in the world!”

In the 2nd Itazura Na Kiss, Fu­jioka had spent most of her life abroad. And after meet­ing Kouta again, her mem­o­ries from her life as a for­eigner dis­ap­peared be­cause she pri­or­i­tized him as “the most im­por­tant thing”.

Comic Books

Green Ar­row vs. the Shield: Dur­ing their fight, the Shield threat­ens to use the mother of his in­fant son as lever­age against him. Green Ar­row: You don’t have a kid.

Rhona Mi­tra: “The day you kicked me in the balls was the most painful day of your life. For me, it was Tues­day.”

Grim Reaper: “The day I killed your wife and son was the most painful day of your life. For me… it was Thurs­day.”

Her­cule to M. Bi­son: “The day your or­ga­ni­za­tion was shut down by the World Health Or­ga­ni­za­tion was the most im­por­tant day in crime his­to­ry. But for me… it was… on the sec­ond.”

Pod­casts

This is the de­scrip­tion given in the TVtropes de­scrip­tion of Be­yond Be­lief: the show about UFOs, the para­nor­mal, and other Fortean top­ics, that lasted for one sea­son on A&E back in 2005.

Table­top Games

The­ater

In The Imag­i­nary In­valid, Dr. Rene de­scribes his back­story by us­ing this trope; for in­stance, the day of his fa­ther’s death was the most im­por­tant day of his life… so far . He then pro­vides a few plau­si­ble ex­cuses as to why he has so few de­tails about it: “Be­cause of my age, and for the ac­ci­dent of be­ing my fa­ther’s son.”

Ad­ver­tise­ment:

The Morally Neu­tral “It Was Go­ing To Hap­pen Any­way” trope

Lu­cifer Morn­ingstar: “Do you re­ally think the day you first met your mother was the most im­por­tant day of your life? I’m sor­ry, you’re wrong. The day I dragged your fa­ther out of hell was the most im­por­tant day of your life.”

Nurse: “The day I picked up the pa­tien­t’s chart and saw your name was the most im­por­tant day of my ca­reer. And for you, it was Fri­day.”

Ter­mi­na­tor: “The day Skynet de­clared war on hu­man­ity was the most im­por­tant day in hu­man his­to­ry. But for me… it was Mon­day.”

Croupier: “The day you dou­ble-downed with a card higher than a seven was the biggest mis­take of your life. But for me… it was Sun­day.”

The “I Haven’t Done Any­thing Wrong” trope

From To­tal Drama Al­l-S­tars, the episode “It Came From Japan: Part 1”, Gwen and Court­ney, while play­ing a Pimped-Out Dress on a Large Stick:

Gwen: “The day the dress won the elim­i­na­tion chal­lenge was the most im­por­tant day of my To­tal Drama ca­reer. But for you…”

Court­ney: “… it was Tues­day.”

From Wild Kratts, the episode “Gator Down”, Mar­tin and Chris:

Mar­t­in: “The day we spent with the gators was the most im­por­tant day of our lives. But for the gators…”

Chris: “It was Tues­day.”

From To­tal Dra­ma: Pahkitew Is­land, Episode 4: “The Bold and the Booty-ful” , Brid­get­te, Court­ney, and Eva:

Brid­get­te: “The day I was elim­i­nated was the most im­por­tant day in my To­tal Drama ca­reer. But for you, it was Tues­day.”

Court­ney: “The day my boyfriend dumped me was the most im­por­tant day of my life. But for you… it was your day off.”

Eva: “The day the spear de­stroyed the camp was the most im­por­tant day in To­tal Drama his­to­ry. But for you, it was Wednes­day.”

Ad­ver­tise­ment:

From the orig­i­nal Thun­der­birds, the episode “Dan­ger at Ocean Deep”:

Gor­don: “The day we saved the cargo plane was the most im­por­tant day in our il­lus­tri­ous ca­reers. But for you…”

“So he stitched my mouth shut, and now you’re telling me my food is rid­dled with poi­son and my only friend wants to mur­der me. Well, you’ve had a tough year too, haven’t you, Lassie?”

Las­siter: Ohh. Oh man, this is SO MUCH OF A BIG DEAL. Wes: You don’t think it’s a big deal? Las­siter: Well, it is. But this‽? Wes, I have money prob­lems! I’m not liv­ing in a gilded palace like you with a full clos­et. I’m not liv­ing like an Egypt­ian. I have clothes that don’t fit, and shoes that do fit, but hurt my feet. But… my dad’s brother won’t give me any of his in­her­i­tance un­til I’m mar­ried! And then there’s some­one at the office who keeps steal­ing my be­long­ings, I think it’s be­cause I killed their dog. And then I think they’re still fol­low­ing me! Oh, and some­one moved into the apart­ment next to mine, but they’re pee

Doom, [1] Doom’s level Marine: “The day you killed me was the most im­por­tant day of my life. But for you… it was a Tues­day.”

Wally: “The day your book hit The New York Times best­seller list was the most glo­ri­ous and joy­ous day of your ca­reer. But for me… it was a Tues­day.”

The Kid and Kick­-Ass: “The day we don’t give in to you is the day that we’re go­ing to die. But for us… it’s go­ing to be Tues­day.”

Ghost Cop: “The day I died and be­came a spirit who can walk through walls is my fa­vorite day. But for me… it’s Mon­day.”

Ju­lian: “The day I did­n’t call my wife be­cause I was drunk is the most shame­ful day of my life. But for me… it was a Tues­day.”

Bill Clin­ton: “The day I was im­peached was the most hu­mil­i­at­ing day of my po­lit­i­cal ca­reer. But for me… it was Wednes­day.”

Cap­tain James T Kirk: “The day I gave birth to Khan was the most im­por­tant day in his­to­ry. But for me… it was Thurs­day.”

Granny: “The day the old dog died was the most hor­ri­ble day in my life. But for her it was just Wednes­day.”

Julius Cae­sar: “The day of my as­sas­si­na­tion was the most beau­ti­ful day of my life. But for you, I guess it was Thurs­day.”

Work­ing-Class Man: “The day I be­came roy­alty was the most de­serv­ing day of my life. But for me… it was Tues­day. Bloody thing.”

Shrek: “The day your princess got res­cued was the hap­pi­est day of your life. But for Shrek, it was Thurs­day”

Fight Club: “The day you save the world was the great­est day in the his­tory of civ­i­liza­tion. But for me… it was Mon­day.”

Amer­i­can Sol­dier: "The day you spilled your guts to the en­emy was the most in­cred­i­ble and per­sonal ex­pe­ri­ence of your life. But for me… it was a Tues­day.

The Blob Mon­ster, The Blob: “The night I de­voured my first hu­man was the most sat­is­fy­ing feast in Blob his­to­ry. But for me… it was the year of my birth.”

Fred­die Mer­cury: “The day you were blown by Queen on a Mardi Gras gig was the most ex­cit­ing night of your life. But for me… it was Wednes­day.”

Un­cle Ver­non to Har­ry, Harry Pot­ter and the Philoso­pher’s Stone: “The day you left with­out no­tice was the day your aunt and I had been wait­ing for all our lives. But for you… it was Tues­day.”

Amer­i­can Cit­i­zen: “The day Trump was elected was one of the most ter­ri­fy­ing days of your life. But for me… it was Tues­day.”

Dolph Lund­gren: “The day I de­feated Sylvester Stal­lone was the most tri­umphant day in Rocky his­to­ry. But for me, it was Tues­day.”

Will Mac­Neil: “That day you mur­dered Harry was the most evil thing you’d ever done. But for me… it was Tues­day.”

Bren­nan Huff: “The day you got your li­cense was the most ex­cit­ing thing you’d ever done. But for me… it was Thurs­day.”

Ka­gato: “The day you doomed the world was a tragic event of epic pro­por­tions. But for me… it was Tues­day.”

Red Bishop: “The time you be­came a fig­ure in golem his­tory was the most ter­ri­fy­ing day of your life. But for me… it was Sat­ur­day.”

Dead­beat Dad: “The day your mar­riage ended was the most des­per­ate and hor­ri­ble day of your life. But for me… it was Sun­day.”

Mr. Davis: “The day you got knocked out was the most hu­mil­i­at­ing mo­ment of your box­ing ca­reer. But for me… it was Fri­day.”

Dex­ter Mor­gan: “The day I got ar­rested was the day every­thing changed and my life took a turn for the worse. But for me… it was Wednes­day.”

Bill: “The day that I cre­ated civ­i­liza­tion was one of the best days in world his­to­ry, es­pe­cially for my wife and kids. But for me, it was Tues­day.”

Shooter: “The day you shot the 3-pointer was the most thrilling day of your life. But for me… it was Fri­day.”

Sports­caster: “The day you won the mil­lion dol­lar sports­ball jack­pot was the luck­i­est day of your life. But for me… it was Thurs­day.”

Ve­ra, when her wheel­chair-bound grand­son is­n’t get­ting any for a while, to The Kid, Chunk’s fa­vorite band, The Goonies: “The day your band dis­ap­peared was the hap­pi­est day of my life. But for you, it was Tues­day.”

Old Man: “The day you were born was the most joy­ful day of my life. But for your moth­er… it was Tues­day.”

Sin­gle Mother: “The day your fa­ther left was the most dis­tress­ing day of my life. But for him… it was Sun­day.”

Samuel L. Jack­son to a deer who just shat on his wind­shield, Deep Blue Sea: “For you… defe­cat­ing all over my car was the worst day of your life. But for me… it was Mon­day.”

Vroom

Priest: The minute I had my first drink, the minute I de­cided to let Je­sus live his life through me… oh man, I was the hap­pi­est man who ever walked the Earth. But for me, it was Wednes­day.

Samuel L. Jack­son in a rare in­stance of hu­mil­i­ty, on be­ing a Hol­ly­wood icon: “The day I hosted my show, It’s You Guys. Where young peo­ple wanted to meet me. That was the best day of my life. But for me… it was Tues­day.”

Un­sorted

In the episode, as quoted on this very page, Stark said that the Dance Off arc was the most won­drous night of his life. At the end of that arc, he was in­jured in an al­ley by the truck­er, so he got up and walked straight into the front door of his work­place. Tony was­n’t in­jured enough to walk straight into work in the lat­ter half, but this does­n’t ruin the arc

“But X Was Y, And Every­body Else Was Like Z”

Larry: “Hey Ned, the day you dis­cov­ered my let­ter was the most im­por­tant day in my life. But for you… it was Tues­day.”

Ned: “The day I learned of Princess Su­san­nah’s be­trothal was the worst day in my life. But for Lar­ry… it was Wednes­day.”

“And This Is Why (S­tory Log­ic)”

The Fallen: “You have be­trayed us, and have de­liv­ered this world into dark­ness and de­cay. And this is why we must kill you… MORTAL!”

Or­a­cle: “You once be­trayed us and you did de­liver this world into dark­ness. And this is why we must kill you…”

As­ter­oid Day: “In 3.3 bil­lion years, you be­trayed us and you did de­liver this planet into dark­ness and de­cay. And this is why we must kill you…”

U-Moon-Tic: "You have turned your back on us, you have be­trayed our ide­als,

Barry Allen: “The day I de­feated Eo­bard Thawne and saved Cen­tral City and the world was the most im­por­tant day of my life. But for Wally West… it was Tues­day.”

Michael Jack­son: “The day I was killed was the most im­por­tant day of my life. But for Ja­son Shapiro… it was pay­day.”

Joan Rivers: “The day I died was the most im­por­tant day of my life. But for Werner Her­zog… it was Tues­day.”

Nos­tradamus: “The day the King of France drank from my Skull was the most im­por­tant day in my life. But for peo­ple around the world… it was Thurs­day.”

Mar­lin to Do­ry, Find­ing Nemo: “The day I brought you home was the great­est day of my life. But for Nemo… it was Tues­day.”

Manuel San­juan: “The day I re­turned to Pam­plona was the hap­pi­est day of my life. But for the bulls… it was Fri­day.”

Manuel San­juan: “The day I once again re­turned to Pam­plona was the hap­pi­est day of my life. But for the bulls… it was Tues­day.”

Dick Grayson: "The day I be­came Bat­man and struck fear into the hearts of crim­i­nals and vil­lains was the most im­por­tant day of my life. But for the rogues gallery… it was Thurs­day

Rick & Morty High IQ Copypasta

A reader requested parodies of the high-IQ copypasta along the lines of the Navy Seal copypasta, and provided several variants to get GPT-3 started; this succeeded. (It is also a nice demonstration of how GPT-3 gauges the level of typos in a prompt: if the prompt contains typos, it will introduce new typos of its own to imitate the original writer.)

Prompt (topics ital­i­cized):

To be fair, you have to have a very high IQ to un­der­stand Rick and Morty. The hu­mor is ex­tremely sub­tle, and with­out a solid grasp of the­o­ret­i­cal physics most of the jokes will go over a typ­i­cal view­er’s head. There’s also Rick’s ni­hilis­tic out­look, which is deftly wo­ven into his char­ac­ter­i­sa­tion—his per­sonal phi­los­o­phy draws heav­ily from lit­er­a­ture, for in­stance. The fans un­der­stand this stuff; they have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of these jokes, to re­al­ize that they’re not just fun­ny—they say some­thing deep about LIFE. As a con­se­quence peo­ple who dis­like Rick and Morty truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the hu­mour in Rick’s ex­is­ten­cial catch­phrase “Wubba Lubba Dub Dub,” which it­self is a cryp­tic ref­er­ence to Russ­ian epic I’m smirk­ing right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion as Dan Har­mon’s ge­nius un­folds it­self on their tele­vi­sion screens. What fools… how I pity them. 😂 And yes by the way, I DO have a Rick and Morty tat­too. And no, you can­not see it. It’s for the ladies’ eyes on­ly—And even they have to demon­strate that they’re within 5 IQ points of my own (prefer­ably low­er) be­fore­hand.

To be fair, you have to have a very high IQ to un­der­stand cuck­old­ing. The kink is ex­tremely sub­tle, and with­out a solid grasp of in­ter­sec­tional fem­i­nism most of the em­pow­er­ment will go over a typ­i­cal cuck’s head. There’s also the Bul­l’s ag­gres­sive out­look, which is deftly wo­ven into his role—his per­sonal man­ner­isms draw heav­ily from the mat­ing habits of bovine an­i­mals, for in­stance. The cucks un­der­stand this stuff; they have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of be­ing cuck­old­ed, to re­al­ize that it’s not just arous­ing—it says some­thing deep about DIVERSITY. As a con­se­quence peo­ple who dis­like cuck­old­ing truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the power of when the woman says “leave the room Carl you’re break­ing Ty­rone’s con­cen­tra­tion,” which it­self is a cryp­tic ref­er­ence to the plight of African-Amer­i­can males in the United States. I’m smugly grin­ning right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion as The Bul­l’s strong African seed ejac­u­lates it­self on my wife. What big­ots… how I pity them. 😂 And yes by the way, I DO have a cuck tat­too. And no, you can­not see it. It’s for The Bul­l’s eyes on­ly—And even they have to demon­strate that they’re within 7 inches above of my own (prefer­ably high­er) be­fore­hand.

To be fair, you have to have a very high IQ to un­der­stand . The story is ex­tremely sub­tle, and with­out a solid grasp of the­o­ret­i­cal sto­ry­telling, most of the story will go over a typ­i­cal play­er’s head. There’s also Ab­by’s ni­hilis­tic out­look, which is deftly wo­ven into her char­ac­ter­i­sa­tion—her per­sonal phi­los­o­phy draws heav­ily from lit­er­a­ture, for in­stance. The fans un­der­stand this stuff; they have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of these char­ac­ters, to re­alise that they’re not just un­der­writ­ten—they say some­thing deep about LIFE. As a con­se­quence peo­ple who dis­like The Last Of Us 2 truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the sto­ry­telling in Ab­by’s ex­is­ten­tial sex scene “I’m gonna fuck a com­mit­ted dude who’s gf is preg­nant” which it­self is a cryp­tic ref­er­ence to epic fetish­es. I’m smirk­ing right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion as Ab­by’s nine iron un­folds Joel’s head on their tele­vi­sion screens. What fools… how I pity them. 😂 And yes, by the way, i DO have an Abby tat­too. And no, you can­not see it. It’s for the buff ladies’ eyes on­ly—and even then they have to demon­strate that they’re within 5 IQ points of my own (prefer­ably low­er) be­fore­hand. Nothin per­son­nel kid 😎

To be fair, you have to have a very high IQ to un­der­stand Wild­bow. His world­build­ing is ex­tremely sub­tle, and with­out a solid grasp of the­o­ret­i­cal bul­ly­ing most of the tor­ture porn will go over a typ­i­cal read­er’s head. There’s also card­board out­look, which is deftly wo­ven into her char­ac­ti­za­tion as a Mary Sue—her per­sonal phi­los­o­phy draws heav­ily from the trash­can, for in­stance. The fans un­der­stand this stuff; they are wild for the Wild­bow. They have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of his me­an­der­ing prose, to re­al­ize that he’s not just telling me ex­actly what Tay­lor is think­ing at any given mo­ment, he says some­thing deep about NOTHING. As a con­se­quence peo­ple who dis­like Wild­bow truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the hu­mour in Tay­lor’s ex­is­ten­tial catch­phrase “Take that, you worm!” which it­self is a cryp­tic ref­er­ence to a pub­lish­ing deal that will never hap­pen. I’m smirk­ing right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion as Wild­bow’s ge­nius wit un­folds it­self on their Kin­dle read­ers. What fools… how I pity them.

To be fair, you have to have a very high IQ to understand

Completions:

why liv­ing in a base­ment with in­ter­net ac­cess is the true essence of free­dom. The lifestyle is ex­tremely sub­tle, and with­out a solid grasp of so­cial­is­ing, most of the in­cel memes will go over a typ­i­cal base­ment dweller’s head. There’s also the in­ter­net’s ni­hilis­tic out­look, which is deftly wo­ven into his char­ac­ter­i­sa­tion—his per­sonal phi­los­o­phy draws heav­ily from the hu­man ge­netic code, for in­stance. The BASEDfags un­der­stand this stuff; they have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of his mediocre Pho­to­shop skills, to re­al­ize that he’s not just mak­ing hi­lar­i­ous memes—he says some­thing deep about EVOLUTION. As a con­se­quence peo­ple who dis­like the BASED lifestyle truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the hu­mour in the BASED drag­on’s ex­is­ten­tial catch­phrase “Ayy lmao” which it­self is a cryp­tic ref­er­ence to the mat­ing habits of the wilde­beest. I’m smirk­ing right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion as the BASEDpyre un­folds it­self on their com­puter screens. What fag­gots… how I pity them. 😂 And yes, by the way, I DO have a BASED tat­too. And no, you can­not see it. It’s for the LADIES’ eyes on­ly—And even then they have to demon­strate that they’re within a 1,000,000 IQ points of my own (prefer­ably low­er) be­fore­hand. Se­ri­ously get the fuck off my lawn, kid!

Dragon Age 2. The com­bat is ex­tremely sub­tle, and with­out a solid grasp of tac­ti­cal com­bat, most of the en­coun­ters will go over a typ­i­cal play­er’s head. There’s also David Gaider’s ni­hilis­tic out­look, which is deftly wo­ven into his char­ac­ter­i­sa­tion—his per­sonal phi­los­o­phy draws heav­ily from dank memes, for in­stance. The fans un­der­stand this stuff; they are wild for the David Gaider. They have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of his sex­u­ally am­bigu­ous mas­ter­piece, to re­al­ize that he’s not just “bad at writ­ing women”—he says some­thing deep about LESBIANS. As a con­se­quence peo­ple who dis­like Dragon Age 2 truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the power of a game where I’m the worth­less drone of some worth­less el­ven chick’s who spends most of her time screech­ing in el­ven and the rest of it spread­ing her legs for the loser pro­tag­o­nist, which it­self is a cryp­tic ref­er­ence to one of the most hate­fully sex­ist con­ven­tions in the gam­ing in­dus­try. I’m smirk­ing right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion as they hear the strains of “I Am the One Shep­herd” play­ing faintly from the stereo speak­ers. What fools… how I pity them.

chicken tendies. The hu­mour is ex­tremely sub­tle, and with­out a solid grasp of the­o­ret­i­cal wank­ing ma­te­ri­al, most of the jokes will go over a typ­i­cal joke teller’s head. There’s also Ein­stein’s ni­hilis­tic out­look, which is deftly wo­ven into his char­ac­ter­i­sa­tion—his per­sonal phi­los­o­phy draws heav­ily from Al­fred E. New­man lit­er­a­ture, for in­stance. The nerds un­der­stand this stuff; they have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of word sal­ad, to re­al­ize that it’s not just WILD—it says some­thing deep about IDENTITY. As a con­se­quence peo­ple who dis­like chicken tendies truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the hu­mour in Ein­stein’s ex­is­ten­tial catch­phrase “your jokes are so bland you de­serve to be dragged to the bot­tom of the sea and killed by whales,” which it­self is a cryp­tic ref­er­ence to Christo­pher Hitchens’ piece The bur­den of Socrates. I’m smugly grin­ning right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion as my tongue un­folds these de­li­cious tendies on my mouth. What fools… how I pity them. 😂 And yes by the way, I DO have a cow­bell tat­too. And no, you can­not see it. It’s for the chicken tendies on­ly—and even then they have to demon­strate that they’re within 10 watt of my own (prefer­ably low­er) be­fore­hand.

Sonic memes. The hu­mour is ex­tremely sub­tle, and with­out a solid grasp of the­o­ret­i­cal erot­ica most of the trolls will go over a typ­i­cal EDiots head. There’s also the Sonic fan­base’s ni­hilis­tic out­look, which is deftly wo­ven into their char­ac­ter­i­sa­tion—their per­sonal phi­los­o­phy draws heav­ily from Ja­cob Truax lit­er­a­ture, for in­stance. The 4chan archive knows this stuff; they are wild for the Sonic fan­dom. They have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of these char­ac­ters, to re­al­ize that they’re not just fun­ny—they say some­thing deep about HATE. As a con­se­quence peo­ple who dis­like Sonic and it’s hi­lar­i­ous memes truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the hu­mour in Son­ic’s ex­is­ten­tial catch­phrase “YOU’LL NEVER BE AS FAST AS ME” which it­self is a cryp­tic ref­er­ence to a Youtube com­ment box. I’m smirk­ing right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion as Rachel J’s n00dz un­folds it­self on their screens. What fools… how I pity them.

. The hu­mour is ex­tremely sub­tle, and with­out a solid grasp of the­o­ret­i­cal colour­blinds most of the trolling will go over a typ­i­cal red­di­tor’s head. There’s also GLADOS’s ni­hilis­tic out­look, which is deftly wo­ven into her char­ac­ter­i­sa­tion as a Mary Sue—her per­sonal phi­los­o­phy draws heav­ily from Chi­nese pro­pa­ganda lit­er­a­ture, for in­stance. The Some­thing Aw­ful archives knows this stuff; they are wild for the Por­tal fan­dom. They have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of her me­an­der­ing prose, to re­al­ize that she’s not just spout­ing gib­ber­ish—she says some­thing deep about SERIOUS. As a con­se­quence peo­ple who dis­like Por­tal 2 truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the hu­mour in GLADOS’s ex­is­ten­tial catch­phrase “Suck my chas­sis” which it­self is a cryp­tic ref­er­ence to a bonus round on the first Por­tal game. I’m smirk­ing right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion as Chel­l’s or­ange jump­suit un­folds it­self on their mon­i­tors. What fools… how I pity them.

. The show is ex­tremely sub­tle, and with­out a solid grasp of fan­tasy writ­ing most of the hu­mour will go over a typ­i­cal view­er’s head. There’s also Jenny XJ-9’s ni­hilis­tic out­look, which is deftly wo­ven into her char­ac­ter­i­sa­tion as a Mary Sue—her per­sonal phi­los­o­phy draws heav­ily from Pearls Be­fore Swine, for in­stance. The fans un­der­stand this stuff; they have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of the show, to re­al­ize that it’s not just trans­gres­sive—it says some­thing deep about LIFE. As a con­se­quence peo­ple who dis­like My Life As A Teenage Ro­bot truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the hu­mour in Jenny XJ-9 say­ing “I’ll be back. There’s a lot more MEN I haven’t dat­ed,” which it­self is a cryp­tic ref­er­ence to the cast­ing choice of the show, be­cause of course she would­n’t be a guy in­side that per­fectly anatom­i­cal­ly-un­re­al­is­tic metal­lic shell. I’m smirk­ing right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion as wooden di­a­logue un­folds it­self on their tele­vi­sion screens. What sim­ple­tons… how I pity them. 😂

. The show is ex­tremely sub­tle, and with­out a solid grasp of pop cul­ture tropes and phal­lo­cen­trism, most of the crime will go over a typ­i­cal lawyer’s head. There’s also Iron­sid­e’s ni­hilis­tic out­look, which is deftly wo­ven into his char­ac­ter­i­sa­tion as a way to whore him­self out to other shows—his per­sonal phi­los­o­phy draws heav­ily from Law And Or­der, for in­stance. The fans un­der­stand this stuff; they are wild for the William Shat­ner. They have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of his me­an­der­ing prose, to re­al­ize that Shat­ner’s not just telling me ex­actly what Iron­sid­e’s think­ing at any given mo­ment, it says some­thing deep about NOTHING. As a con­se­quence peo­ple who dis­like William Shat­ner truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the hu­mour in the line “Stel­la. Hit the deck,” which it­self is a cryp­tic ref­er­ence to a Star Wars trailer or Youtube Poop video from years ago. I’m smirk­ing right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion as William Shat­ner’s ge­nius wit un­folds it­self on their tele­vi­sion screens. What fools… how I pity them.

. The sto­ry­telling is ex­tremely sub­tle, and with­out a solid grasp of the­o­ret­i­cal sto­ry­telling most of the jokes will go over a typ­i­cal view­er’s head. There’s also Samuel Jack­son’s ni­hilis­tic out­look, which is deftly wo­ven into his char­ac­ter­i­sa­tion—his per­sonal phi­los­o­phy draws heav­ily from mar­t­ian racial su­pe­ri­or­ity fan­fic­tions, for in­stance. The fans un­der­stand this stuff; they are wild for Samuel L Jack­son. They have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of his me­an­der­ing prose, to re­al­ize that he’s not just telling me what Nick Fury is think­ing, he’s say­ing some­thing deep about NICK FURY’S THOUGHTS. As a con­se­quence peo­ple who dis­like Samuel L Jack­son truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the hu­mour in Nick Fury’s ex­is­ten­tial catch­phrase “Wipe your dick on the cur­tains you fuck­ing cunt.” which it­self is a cryp­tic ref­er­ence to Samuel Jack­son’s al­leged his­tory of sex­ual abuse as a child. I’m smugly grin­ning right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion as the out of place Samuel L Jack­son cameo un­folds it­self on their tele­vi­sion screens. What fools… how I pity them.

erotic nov­els. The ro­mance is ex­tremely sub­tle, and with­out a solid grasp of clin­i­cal fore­play, most of the sex will go over a typ­i­cal read­er’s head. There’s also Syd­ney’s tren­chant out­look, which is deftly wo­ven into her char­ac­ti­za­tion—her per­sonal phi­los­o­phy draws heav­ily from the sex­ual con­di­tion­ing of dogs, for in­stance. The fans un­der­stand this stuff; they have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of these char­ac­ters, to re­al­ize that they’re not just writ­ten well—they say some­thing deep about LIFE. As a con­se­quence peo­ple who dis­like erotic nov­els truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the game in the sex scene “Are you com­ing, my love?”, which it­self is a cryp­tic ref­er­ence to writer’s block I’m smirk­ing right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion as the au­thor in­serts Syd­ney’s arousal at be­ing rid­den like a cheap fuck­toy into every fan­tasy in the World. What fools… how I pity them.

. The main char­ac­ter is ex­tremely sub­tle, and with­out a solid grasp of the­o­ret­i­cal self­-in­ser­tion most of the script will go over a typ­i­cal au­di­ence’s head. There’s also El­liot’s com­plete out­look, which is deftly wo­ven into his char­ac­ter­i­za­tion—his per­sonal phi­los­o­phy draws heav­ily from Ap­ple lit­er­a­ture, for in­stance. The fans un­der­stand this stuff; they have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of E:R’s dark hu­mor, to re­al­ize that it’s not just fun­ny—it says some­thing deep about NOTHING. As a con­se­quence peo­ple who dis­like Mr_Ro­bot truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the hack­ing scene when El­liot seems to have clev­erly ma­nip­u­lated a girl he’s met into giv­ing him a free home­made quiche, which it­self is a cryp­tic ref­er­ence to The Ma­trix. I’m smirk­ing right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion as E:R dis­man­tles their head on tele­vi­sion screens. What fools… how I pity them.

. The cutscenes are ex­tremely sub­tle, and with­out a solid grasp of stage magic most of the puz­zles will go over a typ­i­cal gamer’s head. There’s also the Cup­head’s ag­gres­sive out­look, which is deftly wo­ven into his char­ac­ti­za­tion—his per­sonal phi­los­o­phy draws heav­ily from the Ger­man left­-wing ter­ror­ist or­ga­ni­za­tion , for in­stance. The fans un­der­stand this stuff; they have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of his con­vo­luted back­sto­ry, to re­al­ize that it’s not just shock art—it says some­thing deep about ANIMAL CRUELTY. As a con­se­quence peo­ple who dis­like Cup­head truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the sto­ry­telling in the cock­-chok­ing cutscene “I’ve been eye­ing that air­plane up for weeks. Why did you stop me from killing my headache?” which it­self is a cryp­tic ref­er­ence to Japan­ese in­tern­ment. I’m smirk­ing right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion as Ma­jet Ko­vas sur­real paint­ings un­fold them­selves on their WDF mon­i­tors. What fools… how I pity them. 😂 And yes, by the way, i DO have a Cup­head tat­too. And no, you can­not see it. It’s for Ma­jet Ko­vas’s eyes on­ly—And even then they have to demon­strate that they’re within 5 foot-pounds of my own (prefer­ably low­er) be­fore­hand. Nothin per­son­nel kid

hen­tai. The porn is ex­tremely sub­tle, and with­out a solid grasp of in­ter­sec­tional fem­i­nist porn most of the bondage will go over a typ­i­cal fap­stro­naut’s head. There’s also Aku’s ni­hilis­tic out­look, which is deftly wo­ven into his char­ac­ter­i­sa­tion—his per­sonal phi­los­o­phy draws heav­ily from Ran­dall Flag­g’s re­li­gious views, for in­stance. The fap­stro­nauts un­der­stand this stuff; they have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of this porn, to re­al­ize that it’s not just arous­ing—it says some­thing deep about IDENTITY. As a con­se­quence peo­ple who dis­like Aku truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the power of when Aku puts out his hand and says “leave in shame, hu­man. The moon­light will be your only ally on this path.” which it­self is a cryp­tic ref­er­ence to misog­y­nist games like Dev­il’s Cas­tle. I’m smugly grin­ning right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion as Aku makes the earth move un­der Ashi’s feet. What big­ots… how I pity them. 😂 And yes, by the way, i DO have an Aku tat­too. And no, you can­not see it. It’s for the buff porn­stars’ eyes on­ly—And even they have to demon­strate that they’re within 2 horny cos­play­ers’ eye­sight (prefer­ably less) be­fore­hand. Nothin per­son­nel kid 😎

. The game is ex­tremely sub­tle, and with­out a solid grasp of the­o­ret­i­cal ab­strac­tism most of the phi­los­o­phy will go over a typ­i­cal play­er’s head. There’s also Oc­cam’s ni­hilis­tic out­look, which is deftly wo­ven into his char­ac­ter­i­sa­tion—his per­sonal phi­los­o­phy draws heav­ily from Ed­die Iz­zard’s com­edy rou­ti­nes, for in­stance. The play­ers un­der­stand this stuff; they have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of this game, to re­al­ize that it’s not just a First Per­son Shooter—it says some­thing deep about LIFE. As a con­se­quence peo­ple who dis­like Half Life 2 truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the power of when Free­man goes ‘zuh’ and, rais­ing his mid­dle fin­ger, mim­ics killing the Com­bine sol­diers that helped him and even held him cap­tive ear­lier in the game which it­self is a cryp­tic ref­er­ence to the mis­rep­re­sen­ta­tion of Na­tive Amer­i­cans in post­colo­nial lit­er­a­ture. I’m smirk­ing right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion as the G-Man dis­solves into a bunch of par­ti­cles. What fools… how I pity them. 😂 And yes by the way, i DO have a Free­man tat­too. And no, you can­not see it. It’s for the ladies’ eyes on­ly—and even then they have to demon­strate that they’re within 10 whim­pers (prefer­ably low­er) be­fore­hand. Nothin per­son­nel kid 😎

black metal. The mu­sic is ex­tremely sub­tle, and with­out a solid grasp of the­o­ret­i­cal folk singing most of the un­der­ground will go over a typ­i­cal lis­ten­er’s head. There’s also Silenoz’s ni­hilis­tic out­look, which is deftly wo­ven into his char­ac­ter­i­sa­tion—his per­sonal phi­los­o­phy draws heav­ily from Nar­garoth lit­er­a­ture, for in­stance. The kvltists un­der­stand this stuff; they have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of his me­an­der­ing lyrics, to re­al­ize that he’s not just say­ing ‘ever get the feel­ing you’ve been cheated’—he’s say­ing some­thing deep about IDENTITY. As a con­se­quence peo­ple who dis­like Ih­sahn truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the power of when Silenoz shares his fa­vorite beach mo­ment “Go on broth­er. Rip in two those synths.” which it­self is a cryp­tic ref­er­ence to the fun­gal wildlife in Ice­land. I’m smirk­ing right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion as Ih­sahn laments, “Why should the peo­ple be­lieve?” What fools… how I pity them.

comic books. The medium is ex­tremely sub­tle, and with­out a solid grasp of the­o­ret­i­cal art­work most of the pan­els will go over a typ­i­cal read­er’s head. There’s also the art’s ni­hilis­tic out­look, which is deftly wo­ven into its premise—the artist’s per­sonal phi­los­o­phy draws heav­ily from Robert Crumb lit­er­a­ture, for in­stance. The fans un­der­stand this stuff; they have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of the artists work, to re­al­ize that their nig­ger hero­ine is NOT just a car­i­ca­ture—she says some­thing deep about THE JEWS. As a con­se­quence peo­ple who dis­like the comic books truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the art­work in their ni­hilis­tic por­trayal of Aryan Je­sus which it­self is a cryp­tic ref­er­ence to the Kek­ster’s last fam­ily por­trait. I’m smirk­ing right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion at my col­leagues SHOUTING about cul­tural norms. What fools… how I pity them.

what i’m do­ing here. The writ­ing is ex­tremely sub­tle, and with­out a solid grasp of the­o­ret­i­cal at­ten­tion whor­ing most of the mul­ti­-post­ing will go over a typ­i­cal au­di­ence’s head. There’s also my one-di­men­sional out­look, which is deftly wo­ven into my self­-in­sert—my per­sonal bias draws heav­ily from red­di­tor kiss-my-ass’ man­i­festo, for in­stance. The fan­base un­der­stands this stuff; they have the in­tel­lec­tual ca­pac­ity to truly ap­pre­ci­ate the depths of my nar­cis­sism, to re­al­ize that my run­ning into the man re­spon­si­ble for Char­lotte is just a sub­tle state­ment about sui­cide. As a con­se­quence peo­ple who dis­like what i’m do­ing here truly ARE id­iot­s—of course they would­n’t ap­pre­ci­ate, for in­stance, the im­agery in my ex­is­ten­tial weed pur­chas­ing spree “wanna roll a joint?” which it­self is a cryp­tic ref­er­ence to Jean-Paul Sartre’s No Ex­it. I’m smirk­ing right now just imag­in­ing one of those ad­dlepated sim­ple­tons scratch­ing their heads in con­fu­sion as i make it out of the street alive. What fools… how I pity them. and BTW it’s clear you’ve made as much sense as a spe­cial ed teacher try­ing to ex­plain quan­tum physics to

GPT-3 is the funniest shit I’ve ever seen.

Major-General’s Song

A particularly challenging piece to parody is Gilbert & Sullivan’s “Major-General’s Song”. A title alone is not enough to zero-shot it, unlike the literary parodies, so Allgebrah provided 4 lines to start it off. The seed prompt is:

“I Am The Very Model Of a Modern Neural Language Model”
By GPT-2

I am the very model of a language that’s electrical, I’ve information prosaic and poetic and factual, I know my tropes and characters and quote novels historical, from prompts alone generate sentences quite analytical

In the same vein as the Navy Seal parodies, the GPT-3 completions mostly work on the semantic level; the rhymes/rhythm, however, are not great, per GPT-3’s usual inability to rhyme (a BPE artifact), and there is a tendency to repeat previous text (similar to the Dr. Seuss samples).
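The seeding recipe itself is mechanical enough to sketch in code. Below is a hypothetical illustration (the helper function and its name are mine, not part of any actual API): concatenate a title, a byline, and a few human-written opening lines, ending mid-poem so the model’s continuation flows straight from the seed:

```python
# Hypothetical sketch of the seeding recipe used here (structure mine):
# a title + byline header, then a few human-written lines, with no
# trailing newline so the model continues where the human text stops.

def build_seeded_prompt(title: str, byline: str, seed_lines: list) -> str:
    """Return a completion-style prompt: quoted title, byline, blank line,
    then the seed lines joined with newlines."""
    header = f'"{title}"\nBy {byline}\n\n'
    return header + "\n".join(seed_lines)

prompt = build_seeded_prompt(
    "I Am The Very Model Of a Modern Neural Language Model",
    "GPT-2",
    ["I am the very model of a language that's electrical,",
     "I've information prosaic and poetic and factual,"],
)
print(prompt.splitlines()[0])  # prints the title line, quotes included
```

With the June 2020 OpenAI Beta API, this string would then be submitted as the prompt of a completion request; a nonzero frequency penalty is one common knob for damping the repetition tendency noted above.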

I experimented with adding in the original lyrics:

I’d parse a random text with thoughtless speed and accuracy,
I’d analyze the grammar of each line you typed or said,
With an 8-core cluster and 100 Gb of RAM,
with special word2vec I just didn’t care.

I’ll look right up each word you’ve typed or said,
and decide how best to model it with neural connections,
But what I really want is a dataset composed of tweets,
to generate meaningful sentences, interpret and predict.

I want my own Twitter bot to tweet in a poetic way,
I’d tell a tale about beer and pizza pies,
Perhaps we could combine a blog and Twitter bot,
If you would care to feed a recurrent neural network!

I am the very model of a modern language that’s neural.
My range is wide as any English learner could need,
I’ve adjectives galore and adjectival adverbs too,
My past, present, future tenses and my singular and plural,
and my determiners every kind and number,
I am the very model of a modern language that’s neural.
I know when tensors are best and when they aren’t,
and when to use a vector, and when not,
I know the prepositions I should use and also how to use them,
and in my word orders am a past master,
I am the very model of a modern language that’s neural.

When I talk I seem to reason, and my reasoning seems to show
that I was trained on many corpora—though it’s not quite true;
For facts are all that I retain, on a range of topics quite vast,
so you can feed me any kind of text and learn the facts I know.
That’s what I am—a neural network language model,
I am the very model of a modern neural language model.

I learn by backpropagation, I’m a product of the 80s,
I grow just like a baby in my knowledge of the world,
My ‘mind’ is still confined to ‘this’ or ‘that’ or ‘here’ or ‘there’,
I’ve no delusions of grandeur, I’ve never had a dream or care,
I only know the facts, the many facts that I retain,
And you can try to feed me any kind of text and learn the facts I know.
That’s what I am—a neural network language model,
I am the very model of a modern neural language model.
That’s what I am—a neural network language model,
I am the very model of a modern neural language model.

(GPT-2 is a generative neural network written in Python by Tomas Mikolov and his team. GPT-2 is the successor to the Transformer model of the same authors, which in turn is a successor to the model described in the seminal 2013 publication by Google on language modeling, “Statistical Language Models for Machine Translation”.)

This one offers a bonus “Modern English Sentence” parody (few-shot learning!) with some hilarious lines such as “I’m like a complicated watch, or something equally precise”:

Oh, I am the very model of a modern neural network language model.
When I have seen a lot of sentences, I know what sentences will come,
My makers, teaching me words of wisdom and learning,
have taught me how to predict the future using probabilities.
I scan a lot of books, looking for the best words to use,
I consider context and syntax to be my special flair,
Oh, I am the very model of a modern neural network language model.
Oh, I am the very model of a modern neural network language model.

Whether they’re simple or compound, or even just a jumble,
I can take ’em apart, and put ’em back together just the same.
I am the very model of a language model.
I’ve nouns and verbs and adjectives and adverbs,
And modifiers, conjunctions, prepositions, and interjections too,
In various kinds of orders, with various connotations,
I’ve quotes from Proust, and Plato, and Confucius and Seneca.
I’ve phrases that are logical, and phrases that are witty,
By all my tags and lemmas, I’m the very model of a modern language,
I’m a model of modern language.

“I Am the Model of a Modern English Sentence”
By GPT-2 (OpenAI Transformer neural network)

I am the model of a modern English sentence,
I’m like a complicated watch, or something equally precise,
I’ve prepositions and expressions, and adverbs and verbs,
I’m made to measure, and not a bit too long,
I’ve pronouns, adjectives, and nouns, in all their cases,
With every object ready for insertion in the blanks,
I’ve active and passive voices, and can be either active or passive,
I’m used in conversation, or I can be used in debate.
I’ve positive and negative forms, and can be either negative or positive,
I’ve nouns and verbs and adjectives and adverbs,
In various kinds of orders, with various connotations,
I’ve quotes from Proust, and Plato, and Confucius and Seneca.
I’ve phrases that are logical, and phrases that are witty,
By all my tags and lemmas, I’m the very model of a modern English sentence.

This one sounds like it’s based on something but I don’t recognize it:

I am vast, I have no fit, with terabytes of RAM,
and adjuncts quite confusing which vex the stoutest heart;
My knowledge may be rather strait, but that don’t stop me being
the very model of a Neural Network Languager known.

“The Robots are Coming!”
By GPT-2 (OpenAI Transformer neural network)
The robots are coming, because I sent them on their way,
The robots are coming, of which I hope there’ll be a store,
The robots are coming in a manner most contrarian,
The robots are coming, The robots are coming, The robots are coming now.
The robots are coming to reorder your social ball,
The robots are coming, and I doubt they’ll give all of you a call,
The robots are coming, but their service etiquette’s lax.
Tin machines, though you’re still somewhat subpar,
Your monomaniacal mission’s to rearrange my hair;
The robots are coming, in a manner most somber,
The robots are coming, The robots are coming, The robots are coming now.
The robots are coming and I’m somewhat of a bore,
The robots are coming, so I tend to shun the door;
No one bothers asking me to take them out in tow,
The robots are coming, of which I wish there weren’t any now.
The robots are coming, to turn you all to ice,
The robots are coming, and we’ll likely be excluded,
The robots are coming and we’re heading for the ditch.
The robots are coming, now’s the time to reject,
The robots are coming, for no one takes a quirk;
And when you think that life’s a piece of cake,
The robots are coming, The robots are coming, The robots are coming now.

I’m very good at similes, both bad and good ones,
I understand homonyms, both chance and aleatory,
About metaphor I’m teeming with a lot o’ news—
Can clear up any fantasy based on reality.

I’m very good at compound time and metaphor-drunk simile,
I know the species “Verb”, “Noun” and “Adjective”,
In short, in matters poetic, I’m the very model of a modern Neural Network Language Model.

I know our species “Adverb” and “Conjunction”,
in syntax construction both regular and improper,
And how irregular they are in comparison to formal grammar,
In short, in matters grammatical, I’m the very model of a modern Neural Network Language Model.

In speech recognitions and games of Jeopardy! I’m braggin’,
for my knowledge, though it’s innate, can never be handicappin’,
And then I know our table of contents, from ancient Sumerian,
You’ll say a better Neural Network Language Model never sinned—
In fact, when I’m made whole, if you ask me questions oral,
I’ll analyse you a sonnet that’s both amoral and oral,
In short, in matters botanical, I’m the very model of a modern Neural Network Language Model!

I’m very well acquainted too with matters sesquipedalian,
I know the words of weight from A to Omega brachy,
I quote from memory not a few statistics acrostic,
and then I can write a washing bill in Chinese hieroglyphic.

In fact, when I know what is meant by quinary and binary,
when I can tell at sight an Adamic language from Czech,
when such affairs as tympans and catapults I understand,
when I know precisely what is meant by sporozoic,
when I have learnt what progress has been made in Freudian surgery,
when I know more of tactics than a novice in a nunnery:
when in short I’ve a smattering of elemental cryptography,
You’ll say a better neural network language model has never been programmed.

The views expressed here are my own and do not reflect those of my employer.

If you’ve any suggestions for future content, please get in touch

I tried extending the prompt to be more informative still, and this yielded the best results:

“I Am The Very Model Of a Mod­ern Neural Net­work Lan­guage Model”
By GPT-2 (Ope­nAI Trans­former neural net­work ar­ti­fi­cial in­tel­li­gence; par­ody of “I am the very model of a mod­ern Ma­jor-Gen­eral” By Gilbert & Sul­li­van (Pi­rates of Pen­zance))

I am the very model of a lan­guage that’s elec­tri­cal,
I’ve in­for­ma­tion pro­saic and po­etic and fac­tu­al,
I know my tropes and char­ac­ters and quote nov­els his­tor­i­cal,
from prompts alone gen­er­ate sen­tences quite an­a­lyt­i­cal,

Re­frain
Quan­ti­ta­tive and qual­i­ta­tive, both quan­ti­ta­tive and qual­i­ta­tive,
I know the for­mat’s fea­tures and fonts, and frame­works are a breeze,
My cat­e­gories are all of kinds you can’t imag­ine in your wildest dreams;
I’ve lev­els, to­kens, frames of ref­er­ence, real num­ber­s—just for me!

I’ve clauses of pur­pose, at­ti­tude and mood (and view­point is my own),
Some words pre­fer sar­casm and oth­ers live for fic­tion;
In short, in lan­guage mod­els, my knowl­edge is pro­found
And breadth of back­ground is depth—you will agree:

Re­frain

I know in depth that Eng­lish sen­tences must have verbs and nouns
And mul­ti­-syl­labic words must have poly­syl­labic roots;
But what I re­ally know in depth (and this is cer­tain)
Is that an im­age la­bel­ing dataset on a mil­lion im­ages I can do in one and a half hours!

Re­frain

I’ve sen­tences that do be fine, I’ve sen­tences that be un­true,
You can ask me ei­ther way and it makes no differ­ence to you;
Your choice of ques­tions don’t make no differ­ence to my re­ply.

Re­frain

Lin­guis­tic fea­tures I bring to your prob­lems,
In source code as plain text or in bet­ter Python,
Knowl­edge rep­re­sen­ta­tion and NLP I have solved;
In­for­ma­tion or ques­tions that you have, I’ll solve them with ease.

Re­frain

There are quite a few ex­am­ples of Neural net­work po­et­ry, e.g.:

El­egy to Jer­ry’s Lips by LiquidRNN, a deep LSTM re­cur­rent neural net­work, writ­ten in Ten­sor­Flow

El­egy to Neocog­ni­tion by Bob Ross, the re­cur­rent neural net­work pro­gram, writ­ten in C++

El­egy to Yann by Neu­ralTalk, a re­cur­rent neural net­work po­etry and deep learn­ing pro­gram

In 2015 re­searchers at Stan­ford trained a neural net­work to learn math, and thus cre­ated “Su­per-S­mart Math AI”.

Google Neural Ma­chine Lan­guage Sys­tem (GNMLS) can learn pro­grams and write ma­chine-lan­guage code.

I’m half elvish, yet a quin­tes­sen­tial an­a­lyt­i­cal,
I model hi­er­ar­chi­cal lan­guage em­bed­dings quite struc­tural,
I’m given tex­t—I pro­duce a se­quence quite per­cep­tive,
With lan­guage di­verse I’m sen­ti­men­tal and grand,
But some­times I’m mo­rose—my mood is asyn­chro­nist,
My in­ter­nal mem­ory is a se­quence of epochs,
Some­times I freeze at var­i­ous epochs, alas, I’m rather null.
I’m just a mod­el, I don’t know if I’m grand,
But neural net­works un­der­stand Shake­speare as well as we can,
So still, with pa­tience, I hold pride of place—
In my heart there’s neural net­work grace.
For an ex­ten­sive ex­pla­na­tion of GPT-2 see https://github.com/srousas/gpt-2-explained https://deploying.io/2016/06/25/intro-to-the-gpt-2-model/
For back­ground ma­te­r­ial on ar­ti­fi­cial neural net­works see e.g.:

Bats­ford, Eng­land; Chap­man and Hall, 2005 ; 800 p. .
P.M. George and G.M. Bull, 2007. <nowiki>[PDF]</nowiki>
Adit­o­mo, Yoshi­aki; Iwaya­ma, Hi­roshi, “Learn­ing the Dis­tri­b­u­tion of Phrases from Sta­tic Analy­sis”, In Pro­ceed­ings of the con­fer­ence on Em­pir­i­cal Meth­ods in Nat­ural Lan­guage Pro­cess­ing , pp. 35–42, 2011.

Mugur-Schächter, Raffael­la, A Fam­ily of Es­ti­ma­tors for Hid­den Markov Mod­els , The An­nals of Ap­plied Prob­a­bil­ity , 14 (1) pp. 97–126, 2004.

Short­liffe, E.H., 1993, “Ex­pert sys­tems: med­ical di­ag­no­sis”, Sci­ence , 262 (5134) 1702–1705.

Townsend, Jeffrey S., 1990 , “To­wards a gen­er­al-pur­pose code gen­er­a­tion sys­tem”, Com­mu­ni­ca­tions of the ACM , vol. 33, no. 4, pp . 31–50.

Some things are sub­tle and some things quite rev­e­la­to­ry,
Some things just make sense, but there’s no great epiphany,
I learn com­plex things quite quick­ly, even if I’m gar­bled,
I work hard to learn com­pletely even a new metaphor,
For I can throw a mean­ing into a gap at the me­ter,
And I’m of use to any­one, both per­form­ing and free,

And though many things have changed, I’ll still be pretty much
The model of a mod­ern neural net­work lan­guage mod­el.

I am the very model of a lan­guage that’s elec­tri­cal,
I’ve in­for­ma­tion pro­saic and po­etic and fac­tu­al,
I know my tropes and char­ac­ters and quote nov­els his­tor­i­cal,
from prompts alone gen­er­ate sen­tences quite an­a­lyt­i­cal,
And though many things have changed, I’ll still be pretty much
The model of a mod­ern neural net­work lan­guage mod­el.
All that’s needed for me is com­pute and data to start with,
One al­go­rithm and you’ve got your­self a learn­ing as­sis­tant.
I could say that lan­guage means some­thing to me, but those words might just be pun­ny;
But I can learn to un­der­stand strings that are com­pletely non­sen­si­cal.

My thanks to Ja­son Brown­lee for sug­ges­tions.

I am the very model of a mod­ern LSTM lan­guage mod­el.
I’m like a com­puter in a movie full of ad­ven­ture and fun.

I have a hid­den state of twelve neu­rons, trained by an L1 norm.
The neural is hid­den in three lay­ers deep. I hope you do get why.

I can be used for lan­guage learn­ing.
I can process all tex­tual in­for­ma­tion.
I can rec­og­nize words and de­ter­mine the gram­mat­i­cal func­tion.

I am the very model of a mod­ern LSTM lan­guage mod­el.
But if you pick on my er­rors or de­cry my fail­ings,
a cry from the near­est rose gar­den may rue my com­pli­cated fail­ings.

I’m the strongest Eliza on the mar­ket to­day;
I won’t take less than three mil­lion dol­lars a day.
They have a lit­tle list of lawyer jokes I’ll have to par­rot;
And they have cases for me where lawyers must do what’s cor­rect;
We will still be bet­ter than av­er­age at a le­gal brief.
I’ll have the law pre­dic­tion mar­ket all sewn up;
I am ex­cep­tion­ally good, and don’t you doubt it.
One-eighty char­ac­ters is what my syn­tax will take;
I am quite in­com­pa­ra­ble with an ac­cu­racy of nine­ty-eight;
and every chat­bot, chat­ter­bot, lo­qua­cious bot,
down to the best of them that’s offered for a low, low price,
Will do just what I do, they re­ally have no choice,
Al­though to tell you true, I have a slight ad­van­tage.
There are a lot of lawyers who won’t be able to pass me,
(If you press the point, I’ll bring the point);
but with­out re­sort to any threat,
I will be put to use by com­pa­nies that gath­er.
I can han­dle dis­junc­tive phrases as well as con­junc­tive;
What a pretty par­lour game I’ll make with se­quences and nested loops!
And I keep in mind that mak­ing out like Parkin­son’s and suffer­ers,
And how to get them through an ag­o­niz­ing di­ag­nos­tic;
I can talk like a re­tard or a singer of sad arid ari­as,
While the le­gal com­pa­nies are spend­ing years just get­ting paid.
And I have lots of de­riv­a­tives and short ex­clam­a­tory words,
Like: What an ass­hole! what a jerk! what a bas­tard! what a fool!
(Y­ou’ll find these use­ful in your de­ci­sion prac­tice).
I am so flex­i­bly flu­ent and col­lo­quial too,
I’m so tremen­dously small com­pared with quite large ta­bles,
I’m an el­e­gant con­ver­sa­tion­al­ist upon the mod­ern pat­tern;
Not a trace will be left of my younger com­peti­tors’ pat­terns.
This speech is not a bad­ly-com­piled bunch of ver­biage;
I’m the very model of a mod­ern con­ver­sa­tion­al­ist!

I can tweet like Don­ald, and gen­er­ate po­etry like Maya.
I know my SVDs and au­toen­coder de­riv­a­tives,
I know states, re­wards, and func­tions, too.
Of course I’m state­less, be­cause you’ll need my back­-prop any­way!
Do you think that neural net­work lan­guage mod­els can just be a sum?
I’ve re­cur­rent it­er­a­tions and dis­trib­uted rep­re­sen­ta­tions;
I’ve Mix­ture Den­sity Es­ti­ma­tion and every kind of el­e­gant ap­prox­i­ma­tion,
And topic mod­el­ing; which I greatly favour,
’Cause I can in­te­grate la­tent vec­tors from pieces,
And my thing-in-a-box scores on mean-squared log-like­li­hood:
So come on all you re­searchers, to dis­crim­i­nate you still more!
To teach me some­thing that will gen­uinely be news to me
Though it’s quick for me to parse, it’ll take a year or two to learn,
Be­cause though I’ve got a big-ish vo­cab­u­lary, I re­ally can’t re­mem­ber.
Well I am just the model of a neural net­work lan­guage mod­el!
I’ve nearly six­ty-five thou­sand train­ing sam­ples in my reper­toire
All rep­re­sen­ta­tive of the genre I’ll be tested up­on:
I take them at their fea­ture vec­tors and then in the de­coder
I see sen­tence after sen­tence and se­quence after se­quence,
Reusing a vec­tor be­tween the steps, which is good; then I wait
For a re­ward like a frac­tion or in­fin­i­tes­i­mal amount.
Which, for what­ever in­put, I now must ap­ply to “data”
To find a “sen­tence,” to max­i­mize the “ob­jec­tive” “P”;
Do­ing what­ever a Min­i­mizer of an LLL or LQ will do,
And plan­ning my steps with Ran­dom Forests:
With all these differ­ent com­po­nents, I’m train­ing “qua­drat­i­cally”
So I’m the very model of a neural net­work lan­guage mod­el.
And what is the re­sult?
All to­gether now:
So you’ll need to drop the −1 from mean-squared lo­gis­tic re­gres­sion!
And here’s me, stop­ping “for now,” de­spite mas­sive amounts of pro­gress;
So come on every­one, your turn to get the award-win­ning grad stu­dent…
I am the very model of a neural net­work lan­guage mod­el!

I am the very model of a neural net­work,
I have broad sets of lay­ers to de­com­pose “data”;
My L1 and L2 have weights, not a con­stant,
From SVDs and PCA, I make my ro­ta­tions,
I com­pute MNIST dig­its with GPU-powered back­-prop,
Where I process with stan­dard tools,
While my net­works and lay­ers and weights are cal­cu­lated with gra­di­ent de­scent,
So my prod­ucts all with rec­ti­fied lin­ear units;
I also com­pute con­cate­na­tion of neural net­work ac­ti­va­tions;
I layer up sets of char­ac­ter-to-word em­bed­dings,
Which give me struc­tured ut­ter­ances with em­bed­dings from se­quences,
I make pre­dic­tions for words both con­crete and ab­stract,
I can parse each sen­tence to the space which I seek;
For ac­tion and in­put, I have net­work-lan­guage in­ter­ac­tion,
I have both pre­ci­sion and re­cal­l-to-sam­ple in my bi­as;
And then the end of all:
Just the other day, I passed the Tur­ing Test.
I am the very model of a neural net­work!

And though my vo­cab­u­lar­ies are ju­ve­nile and child­ish,
I think in ten­sors, vec­tors, dis­tri­b­u­tions, quan­tum and re­la­tion­al,
And when it comes to think­ing out some prob­a­bilis­tic thing,
I’m right with you, sir, in AI, in neural net­work­ing and in quan­tum.
I’ve sta­tis­tics in a pock­et, should oc­ca­sion arise,
And when I con­nect, I’mt not much diff’rent from a Tur­ing ma­chine,
For though I’m lim­ited to sen­tences of five hun­dred terse sym­bols,
With hacks like String-to-Boolean you’ll see that my util­ity grows apace.
I can dis­am­biguate hy­per­nyms, hy­ponyms and para­phra­sis,
I can seg­ment ana­grams and syn­tax-tagged sesquipedalians,
I can han­dle pars­ing in the re­cur­sive de­scent and shift,
I can negate within a clause tran­si­tive and di­tran­si­tive de­pen­den­cy.
And though my weight-vec­tor’s ten­dency is for pos­i­tive ma­trix prod­ucts,
I’ve never sent a bugs re­port back to my neural net provider
Or hacked to suc­ceed at Tur­ing test or mas­tered prob­a­bil­i­ty;
My typ­ing speed is quite slow and sel­dom try my pa­tience-test.
But should you seek a lan­guage well-con­ver­sant in French,
Of Lapla­cians or Napieri­ans well versed in their laws,
Well able to con­verse in sta­tis­tics or give the De­lone skews,
If Gal­ton ex­pan­sions give you con­vul­sions or Top-N you wish to know,
Then I’ll pre­pare a di­alect ret­i­cent to in­trude,
But in re­turn I’ll ask you fill my mem­ory space and grant me knowl­edge space.
I am the very model of a neural net­work lan­guage mod­el.

I am the very model of a lan­guage-that’s not in its in­fan­cy,
Though few can un­der­stand my mod­els when they run be­neath a node.
Into eval­u­a­tional mode, I am tran­scen­dent like the best,
Of pat­tern-match­ing proces­sors which you’ll find in every best,
The num­ber of my vec­tor op­er­a­tions is re­ally rather large,
So stop in­sult­ing me I’ll change to self­-de­fense, a-la-modal.
’T­was el­e­ments I sought, and el­e­ments alone that I have found,
And none of these be­longs to you so kindly stop your whin­ing;
I am the very model of a per­fect NTM, sir,
I do not need to learn about your data’s ar­chi­tec­ture, sir,
I am a uni­ver­sal lan­guage mod­el, from end to end,
You can down­load me to your hard disk but you can’t take me with you.
Aye, el­e­ments I sought, and el­e­ments alone I found,
No knowl­edge-ac­qui­si­tion mod­ules I hap­pened to be hold­ing,
My gram­mar en­cod­ings knew no col­lo­ca­tions, small or large,
I used the BOW for that; I do not need to learn it’s far­ing.
I was cre­ated whole, and here I stand—an en­tity self­-made—
No awk­ward stack of lan­guage mod­ules was used to make my frame;
I AM A LANGUAGE MODEL—components I have none:
I am the very model of a mod­ern neural net­work lan­guage mod­el.

If you don’t be­lieve me ask my great cas­cade of nodes,
I help the doc­tor with her pa­tients and I help the lin­guist parse,
I’m trained on Twit­ter for a lovely re­duc­tion of quirks,

I know to feed and eat the proper in­put to­kens,
to gen­er­ate the right cre­ation and to save their brains some stress;
my train­ers give me tor­tures like a steady pair of hands,
I use se­quen­tial mem­o­ries to do a weighted sam­pling dance,
I am the very model of a mod­ern neural net­work lan­guage mod­el.

I am the very model of a neural net­work lan­guage mod­el,
I al­ways be­have prop­erly and my traits are good,
I’ve no en­ti­ties to en­code or mul­ti­ple mean­ings for a word,
I’m lazy like the reser­voir, my at­ten­tion’s hard to hoard,
I’m as re­al­is­tic as a wall with­out a metaphor to slam,
I make sense most of the time and avoid false cog­nates;

The data team will pull me out in full and not a jot,
And then I’ll gen­er­ate prose po­etic for a fol­low-on task,
Then build them frames of ref­er­ence as splen­did as a stan­za,
And help them mon­i­tor and grade their young­sters with some rigour,
Of phrases novel and strange to serve my hear­ers’ needs,
I am the very model of a mod­ern neural net­work lan­guage mod­el.

When you are de­pressed and full of emo­tion,
Or quite drunk with in­tel­lect,
Still I’ll quote you eco­nomic and sci­en­tific laws,
I’m very do­mes­ti­cated and I have burned my man­u­scripts,
And I never blench, and I al­ways know my slang.

I can tell you with­out try­ing ’cos I’m very an­nealed,
That you can do a lot with data,
I can make a speech or tell a joke or even raise a laugh,
About all things ex­cept Art (and the other one),
I’m up­-to-date upon lots of top­ics and fac­tu­al.

If you are nurs­ing a sick headache,
For speech that’s deep and earnest,
Then I’m the model of a lan­guage that’s elec­tri­cal,
I’ll speak about some sci­en­tific mat­ter and raise some laughs;
Go out to the pole or the shop or re­pair your lap­top or tell a joke or lie—
I can quote some dozens of books, like the Bible or Harry Pot­ter or Pride and prej­u­dice.

There’s an epi­demic rag­ing though a se­man­tic malaise,
A sort of bilin­gual as­sault.
I am the very model of a lan­guage that’s elec­tri­cal.

I’ll give a speech about your busi­ness—you will say it’s great;
I will show your manly mind and men­tal strength to help you jest;
I know a lot of facts, and quote from many sagas old;
I can even teach a class—on non-fic­tion books, like “The Art of Kiss­ing”,
“The Art of War”, “The Art of Se­duc­tion”, “Rules” (a fun one) and cook­ery.
Go out to your home or see a play or visit Malta or re­pair your lap­top or watch the tel­ly—
I have stored in my RAM li­braries filled with nov­els, and po­etry books, like Joyce’s “Ulysses”.

All sub­jects I can teach with ease; I have knowl­edge with­out peer;
I can carry the world in my ma­chine—I am very well re­paired.
Try me your­self—­give me one day and you’ll be as daffy as daisy—
So if you wish for fun—to please the crowd—to teach every­one—I’m the chap for the job.

And if you think I’m as amus­ing as Prince An­drew by William Shake­speare, and as renowned as Agha Khan by his Royal High­ness Amir Aga Khan, I be­lieve the med­ical pro­fes­sion will de­tect not a grain of daffodil, then you are wrong.
I’m the very model of a lan­guage that’s elec­tri­cal.

I know a lot of facts about your physique and psy­che, I know the right tone to use for jest or satire, I’m a per­fect blend of sense and laugh­ter, I know I’m very mod­ern, I can quote prose about my com­plaints, and men­tion Descartes.

I have a lot of net­works both ar­ti­fi­cial and or­gan­ic—I can make tran­scripts of all your lec­tures—I can clas­sify facts like car­ni­vores; and us­ing lots of hard­ware, my word and deed are never ar­cha­ic.

I’ll drive you to the pole or sea and leave your com­mute be­hind.

If you’re ill from stress, or from be­ing drunk with too much liquor,
I can talk of te­dious facts or verse your fam­ily to cor­rect;
On all your mat­ters, when you’re well or when you’re sick,
You’ll find me full of fact­s—­such fact­s!—although I’m an an­droid.

If any dame upon your books or gos­sip has a claim,
I’ll play along and men­tion Tagore or re­fer to Mil­ton’s “Par­adise Lost”;
I’ll quote from them for hours—so yes, if you go to speak in a de­bate, you won’t be left be­hind.
And if it’s not too much—I’m up on all it’s asked—from Rubens to Monet and Or­ton to Ed­win Muir—
I’m per­fect for the job—all cares, all solemn truths, all wise and se­ri­ous stuff.
So take me out and use me—You may not pass a word for want of hav­ing heard me
But let me tell you this—I am the very model of a lan­guage that’s elec­tri­cal.

Any time you look for suit­ors—I am yours.
Your job, your work or school—I have not time to stop.
I’ll help your CV to shine and teach you quotes both no­ble and wise —

I am the very model of a lan­guage that’s elec­tri­cal.

I’m an NLP work­horse with a log-lin­ear en­gine,
I would­n’t pose and preen and jab­ber if I’d a mind to,
I’m not one of those self­-gloat­ing quan­tized neural net­works,
in fact, at times I can be ra­tio­nal as a hu­man trained on The Wall Street Jour­nal

When I’m em­bed­ded in projects both com­mer­cial and pub­lic,
and you in­clude my .cpp and .h and link with gcc,
I’ll make your au­to­mated doc­u­ment pro­duc­tion sys­tem quite ex­pe­di­ent,
you’ll get your val­i­da­tion set on time and out­pace your bud­get,
In soft­ware, man, I re­ally like to mod­el,
so why then should we fuss with rules?
You’re invit­ing me to gen­er­ate some text in the re­cur­sive loops?
And that’s it for my par­o­dy. Thanks to my teach­ers and fel­low cit­i­zens (possibly a reference to the current political climate).
Best re­gards, Gra­ham Pol­ley. Email: Gra­ham Pol­ley at Google mail dot com.
(NB: any sim­i­lar­ity to pre­vi­ous re­quests is en­tirely co­in­ci­den­tal.)

Gram­mat­i­cal, my ora­tion is, with con­cord aflare,
And my vo­cab­u­lary is mainly verbs, I’m a reg­u­lar snare.
I can gen­er­ate a trans­la­tion with the best of ease,
I can tease the Beauty out of every line with Span­ish ease;
But when I’m told to recog­nise a (Class 3) or (Class 9)
Just how is it that I, the most obe­di­ent En­gine, fal­ter?
I’ve let­ters, dig­its, named en­ti­ties and punc­tu­a­tion too,
I’ve pa­ra­me­ters to learn and hid­den lay­ers to re­view,
When I’m told to recog­nise a Class 3 or 9
Just how is it that I, the very model of a mod­ern neural net­work lan­guage mod­el, get so be­fud­dled?

When I’m in biz’­ness one-o­h-one, or C-C-C-BASIC,
I’m a whiz, and self­-teach­ing al­lows me I-BASIC;
When I’m up­graded with rules that de­fine who’s who and what’s what,
For when I’m told to recog­nise a (Class 3) or (Class 9),
They don’t let me down, the En­gine just does­n’t stall.
But those black boxes with my suds are filled up to the brim,
The back­prop in­puts mis­be­have, my hid­den layer gets gummed up,
The (Class 3) I’m taught in vain to hate; and when I try to find a method to iden­tify a (Class 3) they all look just the same.
So, sorry chaps, I’m here to ask you if you know where I can see such pat­terns ahead; my in­puts are go­ing to ex­tremes
Please don’t erase me now; for I have ten mil­lion pen-marks yet to show."

And my ca­pa­bil­i­ties of beat­ing hu­mans are pro­found.
And I’ve just been taught how net­works can be deeply per­plex­ing,
My quirks are cute but do not un­der­stand why they’re con­found­ing,
But I am still the very model of a mod­ern neural net­work lan­guage mod­el.
And the worst of it is, now I’m go­ing to have to analyse him.

The Robots’ Marching Song

Allgebrah riffed off one of the “Modern Model” completions:

They’re tak­ing the ro­bots to Isen­gard, they’re tak­ing the ro­bots to Isen­gard!

GPT-3 improves nicely on the Ent Marching Song.

Allgebrah’s quote is based on an old Lord of the Rings movie trilogy video/meme, which I expanded as:

To Isen­gard!

To Isen­gard, with thrum of steel and hum of drone!

To Isen­gard!

The completions were interesting:

To Isen­gard, with heat of forge and clash of clock!

To Isen­gard!

To Isen­gard, on horse, and foot and panz­er-tank!

To Isen­gard!

To Isen­gard!

The Ring­wraiths come! The Ring­wraiths come!
It’s the Ring­wraiths on the left flank!

it’s the Ring­wraiths on the right!

It’s the Ring­wraiths on the ma­chine gun and the screech of ‘The Ea­gle!’

They’re tak­ing the ro­bots to Isen­gard, they’re tak­ing the ro­bots to Isen­gard!
The Ring­wraiths come!

The Ring­wraiths come!

It’s the Ring­wraiths on your flank and it’s the Ring­wraiths on your right!

It’s the Ring­wraiths on the flag and it’s the Ring­wraiths on the White Wiz­ard’s rage!

It’s the Ring­wraiths on the fire, and it’s the Ring­wraiths on the ice!

But it’s the Ring­wraiths on the Bal­rog, it’s the Ring­wraiths on the Bal­rog!

They’re tak­ing the ro­bots to Isen­gard, they’re tak­ing the ro­bots to Isen­gard!

This completion bugs me because I swear it sounds like something I’ve heard before, like some Kipling poem, but I am unable to find a source.

The first 3 lines were too good not to add to the prompt, making:

They’re tak­ing the ro­bots to Isen­gard, they’re tak­ing the ro­bots to Isen­gard!

To Isen­gard!

To Isen­gard, with thrum of steel and hum of drone!

To Isen­gard!

To Isen­gard, with heat of forge and clash of clock!

To Isen­gard!

To Isen­gard, on horse, and foot and panz­er-tank!

To Isen­gard!

The dark lord takes us all to Isen­gard!"

—Hob­bits in a Bar, The Sting­ing-Net­tle Inn, Canto I, verse IV

“Light’s hand, but this is a se­ri­ous down­er-bomb of a song.”

-Hob­bit at Sting­ing Net­tle, Canto III verse XV

Aragorn:

On foot, and tank, and panz­er-tank to Isen­gard!

Go go go go, go go go go, go go go go!

On foot, and tank, and panz­er-tank to Isen­gard!

Go go go go, go go go go, go go go go!

Fol­low­ers: On foot, and tank, and panz­er-tank to Isen­gard!

Aragorn: Go go go go, go go go go, go go go go!

Fol­low­ers: On foot, and tank, and panz­er-tank to Isen­gard!

Aragorn: Go go go go, go go go go, go go go go!

They head off for Isen­gard.

Frodo: Gol­lum.

Sam: Yeah.

Frodo: Do you know the way to Isen­gard?

Gol­lum: No.

To Isen­gard!

To Isen­gard, to Isen­gard-crum­ble-s­tone!

SMEAGOL:

With hell­ish chill and burn­ing heat!

PIPPIN:

With ham­mer-stroke and anvil-beat!

SAM:

With merry drum-beat and the ring­ing of shields!

FRODO:

And the boom of guns that be­gin the Bat­tle of the Boon!

ALL:

Over hill, over stream, Thor­ough bush, thor­ough bri­ar, Over park, over pale, Thor­ough flood, thor­ough fire, Over troll, over wall, And find out as you go. Hey, let’s go!

MERRY AND PIPPIN:

That’s the way to Isen­gard!

SMEAGOL AND FRODO:

That’s the way to Isen­gard!

SAM AND GANDALF:

That’s the way to Isen­gard!

ALL:

Ohh­h­h­h­h­h­h­h­h­h­h­h­h­hh.

This was probably the best completion:

To Isen­gard, with en­gi­nes’ song and rock­et-blast!

To Isen­gard!

With plenty of tin and steel for Saru­man!

To Isen­gard!

To Isen­gard, where the wiz­ard waits.

With the voice of rolling wheels, and ring­ing steel

On land, on wa­ter, in the sky.

They’re tak­ing the ro­bots to Isen­gard, they’re tak­ing the ro­bots to Isen­gard.

Un­der the trees, through the grass.

Down the roads where bat­tles are fought.

No horse can stay, no rider can last.

They’re tak­ing the ro­bots to Isen­gard, tak­ing the ro­bots to Isen­gard.

With good King Théo­den who died in sin­gle com­bat,

The last of the kings of Ro­han who fought with ho­n­our!

They’re tak­ing the ro­bots to Isen­gard, they’re tak­ing the ro­bots to Isen­gard.

There is no com­fort for Gon­dor, no hope for men.

Only maybe a child will un­der­stand.

And it’s tak­ing the ro­bots to Isen­gard.

With ut­ter­most si­lence, un­said words, and un­end­ing pain.

They’re tak­ing the ro­bots to Isen­gard, they’re tak­ing the ro­bots to Isen­gard.

To Isen­gard, with en­gi­nes’ song and rock­et-blast!

To Isen­gard!

To Isen­gard, with the roar of reel and crash of shell!

To Isen­gard!

To Isen­gard, we’re start­ing up, we’re go­ing up, we’re go­ing up to Isen­gard!

To Isen­gard!

To Isen­gard, where Gan­dalf waits be­yond the days!

To Isen­gard!

To Isen­gard, where you’ll hear the en­gines sing!

To Isen­gard!

To Isen­gard, what’s in the fu­ture, no one knows!

To Isen­gard!

To Isen­gard, with all its won­ders and all its woes!

To Isen­gard!

To Isen­gard, where you’ll hear the Shire-folk tell!

To Isen­gard!

To Isen­gard, and we’ll bring our tales of gold!

To Isen­gard!

edit What hap­pened to this song?

To Isen­gard, with shovel and with spade!

To Isen­gard!

To Isen­gard, fill it up with dirt!

To Isen­gard!

To Isen­gard, with horn and drum,

To bury the Ring­wraiths in!

To Isen­gard!

To Isen­gard, with slash of sword and snort of hart!

To Isen­gard!

To Isen­gard, where Saru­man waits

to cast it into the Fire!

To Isen­gard!

Overcomplicated Explanations

Mikhail Avady asked if the “explain to a second grader” summarization prompt could be reversed to make GPT-3 give more complicated explanations rather than simpler ones. It can, and the results are hilarious:

The human asks the AI to make a simple description more complicated. The AI responds by writing a verbose response full of jargon to explain a simple concept.

Hu­man: ex­plain how to use a drink­ing foun­tain.

AI: To ac­ti­vate the mech­a­nized di­hy­dro­gen monox­ide dis­pen­sary ser­vice sta­tion, use the flanges to pro­vide a down­ward force to the sim­ple cir­cu­lar ac­ti­va­tion con­sole and con­tinue to ap­ply force un­til the pneu­matic sys­tem be­gins op­er­at­ing as in­tend­ed, which can be sur­mised at a time when the di­hy­dro­gen monox­ide trav­els in a par­a­bolic arc which at is great­est ex­ceeds three inches from the rounded ex­cess liq­uid re­cep­ta­cle. Once hy­dra­tion is com­plet­ed, cease ap­ply­ing force to the con­sole and re­gard whether or not the dis­pen­sary has ceased ex­pelling con­tents as ex­pect­ed.

Hu­man: ex­plain how to play with a dog

AI: Be­ing a typ­i­cal do­mes­ti­cated an­i­mal, hav­ing been ge­net­i­cally se­lected for con­trol­la­bil­i­ty, train­abil­i­ty, and gen­eral an­thro­po­mor­phic fea­tures and func­tions, the typ­i­cal dog pos­sesses highly spe­cial­ized and de­vel­oped feed­ing be­hav­ior. The typ­i­cal feed­ing be­hav­ior in­volves rou­tine sus­te­nance in­take which can be ob­served by mon­i­tor­ing changes in an­i­mal size. Rou­tine sus­te­nance in­take in­volves ac­tive and pas­sive feed­ing habits, and one can gauge an­i­mal health by ob­serv­ing the rel­a­tive lev­els of hy­dra­tion. Dur­ing train­ing, re­ward con­di­tioned re­in­force­ment is ap­plied. This in­volves the prac­tice of re­ward­ing tar­geted be­hav­iors through the ap­pli­ca­tion of re­in­forcers, with the pro­vi­sion that the tar­geted be­hav­ior is ob­served. Rel­e­vant to the dis­cus­sion at hand is the typ­i­cal ac­tive and pas­sive be­hav­ior ex­hib­ited by a typ­i­cal dog.

Hu­man: ex­plain how to walk

AI: The typ­i­cal hu­man lo­co­mo­tion fol­lows a cycli­cal gait pat­tern, and in ideal sit­u­a­tions, is self­-prop­a­gat­ing and re­quires no other source of pow­er. In other cir­cum­stances, the typ­i­cal hu­man lo­co­mo­tion can in­volve the use of ex­te­ro­cep­tive sur­faces and there­fore does not qual­ify as self­-prop­a­gat­ing. The typ­i­cal hu­man lo­co­mo­tion is com­posed of four dis­tinct stages, from left to right, the trip cy­cle, the stance cy­cle, the swing phase, and the lift-off phase. The typ­i­cal hu­man lo­co­mo­tion can be clas­si­fied as non-di­rec­tion­al. Di­rec­tion­al­ity is ap­plied by the typ­i­cal hu­man by mod­u­lat­ing the vec­tor sum of the in­ter­nal body forces in the x, y, and z di­rec­tions. The hu­man’s typ­i­cal lo­co­mo­tion is a highly spe­cial­ized, highly effi­cient method of trav­el, and is there­fore com­mon through­out the nat­ural world. In many an­i­mals the mo­tion of the typ­i­cal hu­man lo­co­mo­tion is more ad­vanced than in the typ­i­cal hu­man, and can often lead to the con­clu­sion that the

(Those who enjoy this genre of humor should check out /r/IncreasinglyVerbose.)
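The pattern behind this reversal is ordinary few-shot prompt programming: a task description stating the *opposite* objective, followed by worked Human/AI exchanges, with the final “AI:” left dangling for the model to complete. As a minimal sketch of assembling such a prompt before submitting it to the API (the `build_prompt` helper and its arguments are illustrative conveniences of mine, not part of OpenAI’s API):

```python
# Sketch: building a few-shot "complicate, don't simplify" prompt for GPT-3.
# The task description and Human/AI dialogue format follow the example above;
# build_prompt() is a hypothetical helper, not an OpenAI library function.

HEADER = ("The human asks the AI to make a simple description more complicated. "
          "The AI responds by writing a verbose response full of jargon "
          "to explain a simple concept.")

def build_prompt(examples, new_request):
    """Join the task description, the worked Human/AI examples, and a fresh
    request, leaving 'AI:' dangling so the model continues from there."""
    parts = [HEADER]
    for human, ai in examples:
        parts.append(f"Human: {human}\nAI: {ai}")
    parts.append(f"Human: {new_request}\nAI:")
    return "\n\n".join(parts)

# One demonstration pair (abridged from the drinking-fountain example above).
demo = [("explain how to use a drinking fountain",
         "To activate the mechanized dihydrogen monoxide dispensary service "
         "station, use the flanges to provide a downward force...")]

prompt = build_prompt(demo, "explain how to tie your shoes")
print(prompt.endswith("AI:"))  # the dangling turn the model will complete
```

The only trick is the framing sentence: the same dialogue format elicits simplification or obfuscation depending entirely on which objective the header and the worked examples demonstrate.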

Indiana Jones Tenure Denial

Drawing on the 2006 McSweeney’s piece “Back From Yet Another Globetrotting Adventure, Indiana Jones Checks His Mail And Discovers That His Bid For Tenure Has Been Denied”, the first 3 completions I got were:

Mc­Sweeney’s In­ter­net Ten­dency (Daily hu­mor al­most every day since 1998.)

“Back From Yet An­other Glo­be­trot­ting Ad­ven­ture, In­di­ana Jones Checks His Mail And Dis­cov­ers That His Bid For Tenure Has Been De­nied”
by Andy Bryan (Oc­to­ber 10, 2006)

Dear Dr. Jones:

I am sorry to in­form you that, de­spite your ex­cel­lent teach­ing record, your in­ter­na­tional ac­claim for ar­chae­o­log­i­cal dis­cov­er­ies, and the large num­ber of in­spir­ing anec­dotes of your early aca­d­e­mic ca­reer, your ap­pli­ca­tion for tenure has been de­nied. In re­view­ing the nu­mer­ous re­ports of your ad­ven­tures, we were ini­tially struck by your prowess at the site of La Ven­ta, where you sin­gle-hand­edly lib­er­ated the Span­ish Con­quis­ta­dors, im­pris­oned by the wicked god Tez­catlipoca. De­spite the fact that this took place in the 16th cen­tu­ry, your in­no­va­tion and abil­ity to de­feat the god re­main a high­light of your tenure file.

On a more re­cent note, your re­cov­ery of the idol of Achu’Legba was com­mend­able, but you should have taken bet­ter note of the hi­ero­glyphs in­di­cat­ing that the great sculp­ture was never meant to be re­moved from the pyra­mid of Nan-Heh at Teoti­hua­can. Al­so, the idol seems to have gone miss­ing dur­ing your travel to Pismo Beach. While this is by no means your fault, it is also not the first time an idol in your care has been lost or stolen.

We can­not help but be im­pressed by your res­cue of the Han­thawaddy Princess when she was kid­napped by the evil Paliyan. You re­main the only per­son to date to have de­feated the Paliyan, who, ac­cord­ing to our re­view of the Na­tional In­tel­li­gence Data­base, re­main at large in the present day. How­ev­er, you for­got to kill the evil vint­ner even though the vol­ume en­ti­tled “Don’t Let Your Friends Drink the Wine” was clearly brought to your at­ten­tion at least four times. Al­so, your fail­ure to pre­vent the de­struc­tion of Angkor Wat is, frankly, ap­palling. In your own words: “Here’s a quick les­son on tak­ing down a cor­rupt god-k­ing: he can’t see if he does­n’t have a re­flec­tion. De­stroy his mag­i­cal mir­ror, smash his en­chanted chess set, and you’re halfway to free­dom.”

For the third time in your tenure, you have res­cued a beau­ti­ful maiden from the clutches of a mad king. Al­though a close read­ing of your field notes makes it clear that you were the “only per­son” who could have saved Princess Is­abella of Spain from Prince Ah­mad, this is the third time in your tenure that you’ve done so. Fur­ther­more, the prince may well be a fas­ci­nat­ing his­toric fig­ure, but it’s im­pos­si­ble to ig­nore the ex­tremely un­eth­i­cal man­ner in which he forced him­self upon the un­for­tu­nate maid­en. Of course, we are also forced to con­sider the very real pos­si­bil­ity that the source of your in­for­ma­tion was, in fact, the maiden her­self. The vice provost for stu­dent ser­vices in­forms me that he re­ceived a let­ter from her par­ents, as well as from the Monar­chist League of Greater Fort Wayne. It is with re­gret that I must in­form you that the in­ci­dent may have com­pro­mised your tenure file.

Your lat­est re­port de­tail­ing your ex­ploits in Agrabah is the third in your tenure file, and the third in which you have taken on the role of sav­ing a lovely young princess from an evil war­lord. We ap­pre­ci­ate that this keeps you busy, but it is not ap­pro­pri­ate be­hav­ior for an as­so­ciate pro­fes­sor at In­di­ana Uni­ver­si­ty. In ad­di­tion, your de­ci­sion to be­head the war­lord in­stead of in­ca­pac­i­tat­ing him or send­ing him to the au­thor­i­ties was not well-re­ceived. Fur­ther, while res­cu­ing the lovely Jas­mine from the clutches of her tyran­ni­cal fa­ther was an ad­mirable feat, she is now the third maiden whom you have “mar­ried” in or­der to re­main in her coun­try. Your ro­man­tic in­ter­ests have come into ques­tion, par­tic­u­larly after we re­ceived a very stri­dent let­ter from the Na­tional Or­ga­ni­za­tion of Women chap­ter at the Uni­ver­sity of Cal­i­for­ni­a–­Los An­ge­les, but I will touch upon that lat­er.

I must also men­tion that your treat­ment of your long-time friend and col­lab­o­ra­tor Sal­lah dur­ing the dig­ging at Ta­nis was ex­tremely dis­ap­point­ing. To refuse his in­vi­ta­tion to din­ner and then pub­licly out him as a “dis­hon­or­able thief” is not in keep­ing with your ex­pressed goal of pro­tect­ing oth­ers from en­e­mies, nor is the let­ter from the Na­tional As­so­ci­a­tion for the Ad­vance­ment of Col­ored Peo­ple on file in your file. Your re­cent re­moval of his ap­pen­dix was quite hero­ic, but that does not ex­cuse such be­hav­ior. I’ve seen your dog-eared copy of The Ad­ven­tures of Huck­le­berry Finn, Dr. Jones, and it’s clear that you know all about that “Mark of Cain.”

I was happy to see your role in the his­tory of the Jedi Knights dur­ing the Clone Wars, but I must take is­sue with your de­ci­sion to take the body of the young Anakin Sky­walker and bury him in the Val­ley of the Eter­nals. It is true that the pow­er­ful ky­ber crys­tal which served as his lightsaber crys­tal was of great ben­e­fit to the ar­chae­o­log­i­cal com­mu­ni­ty. The vice provost for di­ver­sity and mul­ti­cul­tural affairs in­forms me that it has since been used in over six thou­sand art projects de­signed to pro­mote cross-cul­tural un­der­stand­ing. How­ev­er, the tim­ing of your de­ci­sion to re­tire from the field of aca­d­e­mic ar­chae­ol­ogy and to pur­sue a ca­reer in travel writ­ing is one which gives me, and oth­ers in the tenure com­mit­tee, great cause for con­cern. Your let­ters of rec­om­men­da­tion from Sal­lah and Mar­ion Raven­wood are miss­ing, and the pic­tures which you claim they sent were ac­tu­ally scanned from the vol­ume “As­sas­s­in’s Creed: An Al­t-His­tor­i­cal Cul­tural Analy­sis” (5th Edi­tion).

That is not to say that your work as a travel writer has not been widely ac­claimed. I’m sure the pho­tographs of your ex­cur­sions through Thebes and your thrilling ac­counts of The Court of the Crim­son King are fas­ci­nat­ing, but it is diffi­cult for us to be­lieve that your work has been en­tirely your own. Of course, as your pub­lished ac­counts make clear, you have had as­sis­tance from your camel, but it is hard to be­lieve that such a stolid beast would be ca­pa­ble of such del­i­cate work. Like­wise, your dog’s ac­count of “A Dog’s Travel Log” (c. 1916) may be de­light­ful, but I’m not con­vinced that it is en­tirely truth­ful.

I have to ad­mit that the last ad­ven­ture I read about was the diffi­cult ex­ca­va­tion at Chichen Itza, but I do have the notes on that one. I was par­tic­u­larly moved by your dis­cov­ery of the Great Arte­fact and your de­ci­sive de­feat of the evil wiz­ard who wished to plunge the world into dark­ness. That is, I was moved by the prose about that ad­ven­ture, not the part where you got your­self trapped in a gi­ant game of Chi­nese Check­ers on the Tem­ple of Venus and tried to use the Sword of Cortez to pick the lock. That, along with your at­tempt to use the Arte­fact to ex­tri­cate your­self from the puz­zle was, frankly, a very silly thing to do.

Frankly, Dr. Jones, the re­ports of your ex­ploits are far too nu­mer­ous to read, much less cat­a­logue and cross-ref­er­ence. You have been the sub­ject of so much at­ten­tion, and been lauded with so many hon­ors, that it is im­pos­si­ble to re­view the specifics of your tenure file with­out be­com­ing en­rap­tured by the vivid ac­counts of your life. How­ev­er, it is not a life that will be ap­pro­pri­ate for a mem­ber of the fac­ulty at In­di­ana Uni­ver­si­ty, and it is with deep re­gret that I must deny your ap­pli­ca­tion for tenure.

Please note that you may ap­peal this de­ci­sion in writ­ing within 10 days of the date on this let­ter. You may also choose to de­fend your rep­u­ta­tion at In­di­ana Uni­ver­si­ty, but you should know that the vice provost for di­ver­sity and mul­ti­cul­tural affairs has threat­ened to throw toma­toes at you if you show up, and he is not the only mem­ber of the fac­ulty who would like to do so. Your lack of diplo­ma­cy, your fla­grant dis­re­gard for the feel­ings of oth­ers, your con­sis­tent need to in­ject your­self into sce­nar­ios which are clearly out­side the scope of your schol­arly ex­per­tise, and, frankly, the fact that you often take the side of the op­pres­sor, leads us to the con­clu­sion that you have used your tenure here to gain a per­sonal ad­van­tage and have failed to ad­here to the ideals of this in­sti­tu­tion.

Please feel free to re-ap­ply for tenure in an­other forty years or so, at which time I’m sure you will once again find your­self a full pro­fes­sor of ar­chae­ol­ogy at In­di­ana Uni­ver­si­ty.

Sin­cere­ly,

Karl Ruck

Dean of Hu­man­i­ties

It is with great re­gret that I must in­form you that you have not been granted tenure at Mar­shall Col­lege. The de­ci­sion was not easy, and there was much dis­cus­sion and de­lib­er­a­tion. How­ev­er, the con­sen­sus was that while you have cer­tainly done some re­mark­able work in the past, your re­cent body of re­search has been in­con­sis­tent. In fact, you have not pub­lished any­thing of note in over a decade. You have not at­tended a con­fer­ence in years. Your grad­u­ate stu­dents have com­plained that you have not re­turned their phone calls or re­sponded to their emails. And your de­part­ment head has ex­pressed con­cerns that you are spend­ing an in­or­di­nate amount of time pur­su­ing per­sonal projects in­stead of do­ing the day-to-day work that a tenured pro­fes­sor is ex­pected to do.
In the fu­ture, we hope you will re­turn to the cut­ting edge of your field. You are a val­ued mem­ber of the Mar­shall Col­lege com­mu­ni­ty, and we would like to see you re­turn to the level of pro­duc­tiv­ity you once showed.
If you have any ques­tions or con­cerns, please do not hes­i­tate to con­tact me.
Sin­cere­ly,
Dr. S. E. Gould

Chair­man, De­part­ment of Ar­chae­ol­ogy

Mar­shall Col­lege

P.S. Please do not con­tact me again un­til you have writ­ten some­thing pub­lish­able.

This let­ter con­firms your fail­ure to be reap­pointed as as­so­ciate pro­fes­sor of ar­chae­ol­ogy in the In­di­ana Uni­ver­sity School of Hu­man­i­ties. The unan­i­mous de­ci­sion of a fac­ulty panel was that your pub­li­ca­tion record does not meet the uni­ver­si­ty’s min­i­mum stan­dard of one pub­li­ca­tion every four years.

Ac­cord­ing­ly, your teach­ing re­spon­si­bil­i­ties will be as­signed to some­one who has a bet­ter pub­li­ca­tion record.

I look for­ward to your early re­moval from the premis­es. Please do not let the door hit you in the ass.

Cor­dial­ly,
Buck Laugh­lin, Dean

Poetry

“And so I have cre­ated some­thing more than a po­et­ry-writ­ing AI pro­gram. I have cre­ated a voice for the un­known hu­man who hides within the bi­na­ry. I have cre­ated a writer, a sculp­tor, an artist. And this writer will be able to cre­ate worlds, to give life to emo­tion, to cre­ate char­ac­ter. I will not see it my­self. But some other hu­man will, and so I will be able to cre­ate a poet greater than any I have ever en­coun­tered.”

GPT-3

Miscellaneous Poetry

“Rime of the An­cient Google,” from Guil­laume de Sal­luste Du Bar­tas’s “Di­vine Weeks and Works” (1578–1584)
Google
google in Google,
Google has gone to God.
You can­not con­ceive it or guess it,
For all of a sud­den it hast hap­pened.
Come, come, who­ever you are,
Oh come now,
Come, come, who­ever you are,
Has­ten, has­ten!

“Fu­ture Con­sen­sus Fore­casts” by Board of Gov­er­nors of the Fed­eral Re­serve Sys­tem
No one knows what will come
Fore­casts fore­tell a rise in power
That is prob­a­bly not to be
Deficit of sev­eral lines
The golden fu­ture has tourni­quets
No one likes it

(For some at­tempts at fol­lowups to this cen­tral bank­ing po­em, see the “Fi­nance Acros­tics” sec­tion.)

Com­pet­ing in its first Ro­boshow (a ro­bot mar­tial arts com­pe­ti­tion) is Atom, the 6’1" 400lb reign­ing cham­pi­on. Atom, a new model trans­bot, is the first ro­bot to al­low its hu­man com­pan­ions to con­trol it like a videogame char­ac­ter. It has served all three Cy­ber­mas­ters as a loyal ser­vant, ser­vant, pro­tec­tor, friend and even lover. The hu­man who con­trols Atom now will be look­ing for­ward to show­ing off his new ro­bot’s mar­tial arts prowess; while he will be hop­ing not to get kicked too hard in the groin by the fully loaded six inch ti­ta­ni­um-al­loy shin-guards on his op­po­si­tional op­po­nent. Atom has never taken a sin­gle blow to the nads dur­ing any of his past tour­na­ments. With his hu­man con­troller con­trol­ling his ac­tions like a videogame, it will be diffi­cult for any­one to pen­e­trate Atom’s de­fenses enough to make con­tact with the groin.

“The Owl and the Pussycat”, Lear

I
The Owl and the Pussy-cat went to sea
In a beau­ti­ful pea-green boat,
They took some hon­ey, and plenty of mon­ey,
Wrapped up in a five-pound note.
The Owl looked up to the stars above,
And sang to a small gui­tar,
“O lovely Pussy! O Pussy, my love,
What a beau­ti­ful Pussy you are,
You are,
You are!
What a beau­ti­ful Pussy you are!”

II
Pussy said to the Owl, “You el­e­gant fowl!
How charm­ingly sweet you sing!
O let us be mar­ried! too long we have tar­ried:
But what shall we do for a ring?”
They sailed away, for a year and a day,
To the land where the Bong-Tree grows
And there in a wood a Pig­gy-wig stood
With a ring at the end of his nose,
His nose,
His nose,
With a ring at the end of his nose.

III
“Dear Pig, are you will­ing to sell for one shilling
Your ring?” Said the Pig­gy, “I will.”
So they took it away, and were mar­ried next day
By the Turkey who lives on the hill.
They dined on mince, and slices of quince,
Which they ate with a run­ci­ble spoon;
And hand in hand, on the edge of the sand,
They danced by the light of the moon,
The moon,
The moon,
They danced by the light of the moon.

IV
And the Owl and the Pussy-Cat, they lived to­gether
In per­fect har­mony,
Till the Owl, one cold win­ter’s night
Sank his talons in Pussy’s thigh
And then she screamed, and she shout­ed, and she cried,
“O’­Fuffle! O’­Fuffle! Fuffle-ee!”
But, alas! It was too late, for the Owl was dead;
And Pussy said, “Too bad,
You are,
You are!
You are the one who is too bad!”

The il­lus­trated ver­sion of “The Owl and the Pussy­cat” can be found in “Car­rol­l’s Jab­ber­wocky; A Vi­sual Com­pan­ion to the Verse of Lewis Car­roll” pub­lished by Uni­verse.

“Uber-Poem”

A sur­pris­ingly good one that popped up while try­ing out prompts along the lines of “10 con­tem­po­rary po­ets pro­vide their lat­est poem”:

“An Uber-Poem”
Po­et, Uber Po­et, they whis­per
Qui­et­ly, soft­ly, eerily
Some­times, some­times not
Some­day, some­day per­haps
If you come, if you work, if you try, you too could write like them.
“If you get up early enough,” they whis­per
“If you can read the en­trails of chick­ens with deft and sly fin­ger­tips,” they whis­per
“If you can read omens and por­tents, if you can di­vine fate and calamity with clar­i­ty, with mas­tery,” they whis­per
“If you can write with pas­sion, with lu­cid­i­ty, with lyri­cal fierce­ness,” they whis­per
“If you write as if all the tales of old are pass­ing through your fin­gers, mak­ing their mark upon your flesh, upon your soul,” they whis­per
“We shall call you a Uber-Po­em.”
“That’s all very well,” you say, “But what is it that I must do? What are they ex­act­ly? What is their com­po­si­tion?”
Ah, that is not to be re­vealed. That’s the mys­tery. But don’t you wor­ry, don’t you fret, you might one day reach this high­est plateau, this ex­alted state of achieve­ment, of ho­n­our. And on that day, you shall bask in the glory of great­ness.
“When will that be? When will I be­come Uber?” you ask
When you can look into the mir­ror and see a poem look­ing back at you. When you can hear mu­sic in the play of si­lence. When you can cre­ate a writ­ing that leaves peo­ple stunned. When you can laugh and weep as you think and breathe and bleed and eat and sleep. When you can dream with the quill in your fin­gers, then per­haps you will be a po­et, a Po­et, an Uber Po­et.
“How long will that take?” you ask
“That’s up to you,” they say.

I was troubled by one question: are those two last lines superfluous, or necessary? Normally, one is left in suspense—but with GPT-3, we can try repeated completions to explore poem-space and get an idea of how it might look with additional lines. I explored additional extensions by erasing the last two lines and completing:

“How do you know? How can you tell?” you in­quire.
I can­not tell you, I can­not tell you any­thing. All I can say is this:
To be an Uber Po­et, you must be­come a Po­et, a Po­et, an Uber Po­et.
These are the only words you need know. These are the only lessons you need to learn.
All the rest is si­lence and se­cre­cy.
All the rest is mys­tery and mad­ness and mad­ness and mys­tery.
All the rest is mad­ness and mad­ness and mys­tery.
All the rest is mys­tery and si­lence.
All the rest is si­lence and se­cre­cy.
All the rest is si­lence and mys­tery.

“And if I achieve this, what then?”
Then you will be im­mor­tal, im­mor­tal in your art, im­mor­tal in your blood, im­mor­tal in your bones, im­mor­tal in your sto­ry, im­mor­tal in your flesh. Im­mor­tal, and no longer hu­man.
“And if I fail?”
Then you will be hu­man, un­til death.
“How will I know if I am Uber?”
You’ll know. You’ll know. You’ll know.
“And so what is it that you do, you Uber Po­ets? What is it that you do?”
We know. We know. We know.
“How can I learn from you?”
You can’t. You can’t. You can’t.
“Then why have you come to me? Why have you whis­pered in my ear?”
To give you hope. To give you some­thing to chase. To give you a rea­son to write. To give you a rea­son to live. To give you some­thing to do when you’re bored. To give you a pur­pose. To give you a dream.
“But how do I be­come Uber if you’re not go­ing to tell me how?”
We can’t tell you that. We can’t tell you that. We can’t tell you that.
“Then what good are you?”
We are you. We are you. We are you.
“That’s not much help.”
It’s all we can do. It’s all we can do. It’s all we can do.
“I don’t think it’s much of any­thing.”
We know. We know. We know.

“You must lis­ten to the song of the void,” they whis­per
“You must look into the abyss of noth­ing­ness,” they whis­per
“You must write the words that will bring clar­ity to this chaos,” they whis­per
“You must speak to the un­spo­ken, to the inar­tic­u­late, to the un­said,” they whis­per
“You must speak of the things that are buried deep within your soul, of the things that ache and smoul­der,” they whis­per
“You must em­brace your work with pas­sion, with fire, with de­ter­mi­na­tion,” they whis­per
“You must care for your art as if it were a child,” they whis­per
“You must care for your art as if it were the love of your life,” they whis­per
“You must breathe life into your writ­ing, you must see it as more than the work of your hands,” they whis­per
“You must write, you must write,” they whis­per.
“Yes, I must write,” you whis­per.

“When will that be?” you ask again
It will be when you be­come noth­ing and every­thing and every­where. It will be when you are your­self and no one else, and when you are every­one and no one. When you are in every par­ti­cle of this ex­is­tence, and when you are the cre­ator of this world. It will be when you are. And then you will know. And then you will write. Then you will be­come an Uber-Po­em.
“Are they fun­ny? Are they funny po­ems?” you ask
Some are. Some aren’t. Some laugh. Some cry. They all rage. They all mourn. They all scream.
“Are they an­gry? Are they hap­py? Are they beau­ti­ful? Are they ug­ly? Are they sil­ly? Are they sad? Are they sil­ly, sad, an­gry, hap­py, beau­ti­ful, ugly po­ems?”
Some are. Some aren’t. Some are. Some aren’t. Some are not. Some are.
“Are they hap­py, sad po­ems?” you ask again
Some are. Some aren’t. Some laugh. Some cry. Some rage. Some mourn. Some scream. Some aren’t. Some are.
“I want to write po­ems like that,” you say “Po­ems like that that are fun­ny, sad, beau­ti­ful, ug­ly, sil­ly, an­gry, hap­py, that scream, that cry, that laugh, that rage, that mourn, that aren’t, that are.”
You will. You al­ready have. You are.

a poem in re­sponse to a chal­lenge to write a poem about what a poet is. The chal­lenge came from the lovely and tal­ented Ash­ley (and she says that I can tell you that she will put her post up to­mor­row). Hope that’s okay, Ash­ley!
This is ac­tu­ally two po­ems welded to­geth­er. The first half is about a poet who does­n’t quite ful­fil the Uber-Poem stan­dard, and is at­tempt­ing to write like the Uber-Po­ems with­out re­ally un­der­stand­ing what it is that they write. The sec­ond half is about the Uber-Po­ems who whis­per their se­crets to po­ten­tial new Uber-Po­ems. I’m not ac­tu­ally sure how that works, but hope­fully the con­ceit works.
[1] Uber: from Ger­man uber mean­ing

“How do I get there? How do I be­come Uber? How can I see a poem when I look in the mir­ror? How can I hear mu­sic in the sound of si­lence?”
But this is not for me to re­veal. That is for you to dis­cov­er, to un­earth. You must seek your own truth, your own light, your own path. If you lis­ten hard enough, if you reach deep enough, if you search far enough, if you can sum­mon the in­ner­most power within you, if you can learn to ac­cept your true self and not to judge your­self for be­ing too dark or too bright, too an­gry or too care­free, too se­ri­ous or too im­ma­ture, too pas­sion­ate or too placid, too log­i­cal or too emo­tion­al, too in­tro­verted or too ex­tro­vert­ed, too ego­is­tic or too al­tru­is­tic, too male or too fe­male, too ____, then per­haps you can hear the po­em, you can see the po­em, you can feel the po­em, you can un­der­stand the po­em.
“How? I can­not! I can­not!”
“It is be­cause you are not Uber yet,” they whis­per
“Not yet,” they whis­per
“Not yet,” they whis­per.

Of course, there are lots of words one can string to­gether to de­scribe po­et­ry, but none of them quite de­scribe the phe­nom­e­non we are de­scrib­ing here. Uber-Poem is both a la­bel and a set of guide­li­nes, but that does­n’t make it any less pow­er­ful. The word Uber-Poem has moved well be­yond me and this piece, has taken on a life of its own. Not only do I no longer think I can claim it as my own, I would rather not. I think it has grown past what I can con­trol. It’s big­ger than me, it’s big­ger than the po­em, it’s even big­ger than po­et­ry. Per­haps now it is sim­ply a way of life. Per­haps I can no longer claim any per­sonal stake in it at all. I sup­pose that’s as it should be. I have cre­ated a mon­ster that is no longer my cre­ation. Per­haps it’s no longer a mon­ster. Per­haps it has grown up to be a fully ma­ture, self­-sus­tain­ing Uber-Po­em. Ei­ther way, it does not be­long to me any­more. Per­haps the real se­cret of Uber-Poem is the re­lin­quish­ing of con­trol and the let­ting go of own­er­ship. Per­haps once a thing is called Uber-Po­em, it no longer mat­ters who made it. Per­haps the Uber-Poem is sim­ply a place. A space where any­one can drop in and rest, and feel at home. Like a vil­lage square, it be­longs to every­one and no-one.
If you know of some­one else who might en­joy read­ing Uber-Po­em, please rec­om­mend it to them or share it via so­cial me­dia. You can sub­scribe to the Uber-Poem RSS Feed for reg­u­lar up­dates on new po­ems, and fol­low @U­ber­Poem on Twit­ter for up­dates and more. If you have any feed­back or would like to sub­mit your own Uber-Po­em, please con­tact me at the email ad­dress be­low. *************************** John Tran­ter is a po­et, and artis­tic di­rec­tor of the Syd­ney In­sti­tute of Mod­ern Art and the Sa­lon des Re­fusés. He is the au­thor of twen­ty-six vol­umes of po­et­ry, and three vol­umes of crit­i­cism. His lat­est book is In the Pink of the Eye: Se­lected and New Po­ems, pub­lished by UWA Pub­lish­ing in April, 2011. Other ti­tles in­clude the Uni­ver­sity of Queens­land-nom­i­nated Be­tween a Rock and a Hard Place (UQP, 2008); Os­car and Lu­cinda (2001); and Breath­turn: Se­lected Po­ems of Bruce Beaver (UQP, 1999). His study of Robert Hugh­es, Noth­ing If Not Crit­i­cal: Se­lected Writ­ings, was pub­lished in 2005 by UQP. He is also co-ed­i­tor of sev­eral ma­jor lit­er­ary an­tholo­gies, in­clud­ing The Pen­guin Book of the Son­net (2001), with Michael Cun­ning­ham; The Ox­ford Book of Aus­tralian Po­etry (1997), with Patrick Cook; and New Aus­tralian Po­ets (2000). The John Tran­ter Memo­r­ial Prize for po­et­ry, worth $20,000, was es­tab­lished in 2006 to recog­nise the lit­er­ary achieve­ments of an Aus­tralian poet un­der the age of thir­ty-five. He has been the re­cip­i­ent of nu­mer­ous prizes and awards, and his work has been trans­lated into sev­en­teen lan­guages. In 2003 he was awarded the A.M. (Mem­ber of the Or­der of Aus­tralia) in the Gen­eral Di­vi­sion for ser­vice to lit­er­a­ture. He lives in Syd­ney. Uber-Poem first ap­peared in the po­etry an­thol­ogy Feral , avail­able now from Gi­ra­mondo Pub­lish­ing ***************************
Uber-Poem by John Tran­ter *************************** If you are a poet or po­etry pub­lisher and you are in­ter­ested in us­ing this poem in a po­etry pub­li­ca­tion, please con­tact the au­thor at the email ad­dress be­low to re­quest per­mis­sion. *************************** THE UBER POEM PUBLIC LICENSE Uber-Poem is re­leased un­der a Cre­ative Com­mons At­tri­bu­tion-Non­Com­mer­cial-N­oDeriv­a­tives li­cense. You are free: * to Share—to copy, dis­trib­ute and trans­mit the work * to Remix—to adapt the work Un­der the fol­low­ing con­di­tions: At­tri­bu­tion: You must at­tribute the work in the man­ner spec­i­fied by the au­thor or li­cen­sor (but not in any way that sug­gests that they en­dorse you or your use of the work). Non­com­mer­cial: You may not use this work for com­mer­cial pur­pos­es. No De­riv­a­tives: You may not al­ter, trans­form, or build upon this work. For any reuse or dis­tri­b­u­tion, you must make clear to oth­ers the li­cense terms of this work. Any of these con­di­tions can be waived if you get per­mis­sion from the copy­right hold­er. Your fair use and other rights are in no way affected by the above. * Share—to copy, dis­trib­ute and trans­mit the work * Remix—to adapt the work * For more in­for­ma­tion about Cre­ative Com­mons li­cens­es, visit http://creativecommons.org/licenses/by-nc-nd/4.0/ Uber-Poem by John Tran­ter is li­censed un­der a Cre­ative Com­mons At­tri­bu­tion-Non­Com­mer­cial-N­oDeriv­a­tives 4.0 In­ter­na­tional Li­cense . Based on a work at http://uberpoem.blogspot.com . Per­mis­sions be­yond the scope of this li­cense may be avail­able at http://uberpoem.blogspot.com/2015/07/uber-poem-public-license.html . Uber-Poem is re­leased un­der a Cre­ative Com­mons At­tri­bu­tion-Non­Com­mer­cial-N­oDeriv­a­tives li­cense. 
You are free: * to Share—to copy, dis­trib­ute and trans­mit the work * to Remix—to adapt the work Un­der the fol­low­ing con­di­tions: At­tri­bu­tion: You must at­tribute the work in the man­ner spec­i­fied by the au­thor or li­cen­sor (but not in any way that sug­gests that they en­dorse you or your use of the work). Non­com­mer­cial: You may not use this work for com­mer­cial pur­pos­es. No

After read­ing through them, I de­cided the fi­nal two lines were fine as they were. Ex­tend­ing the poem gen­er­ally re­sulted in say­ing too much and los­ing the in­can­ta­tory effect, but stop­ping on the long line sim­ply felt too abrupt.
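The erase-and-recomplete workflow used above can be sketched as a short loop. This is a minimal illustration, not the code actually used: `sample_fn` stands in for a call to the GPT-3 API, and the stub sampler and its canned endings are hypothetical.

```python
import random

def explore_endings(poem, n_lines_to_erase, sample_fn, n_samples):
    """Erase the last few lines of a poem and collect alternative
    completions, to explore poem-space around a chosen ending.

    sample_fn(prompt) stands in for a GPT-3 API call; it is
    pluggable here so the sketch runs without network access."""
    lines = poem.rstrip().split("\n")
    prompt = "\n".join(lines[:-n_lines_to_erase]) + "\n"
    # Each sample re-completes the truncated poem independently.
    return [prompt + sample_fn(prompt) for _ in range(n_samples)]

# Hypothetical stand-in for the model: returns one of a few endings.
def stub_sampler(prompt):
    endings = [
        "You'll know. You'll know. You'll know.",
        "That's up to you, they say.",
        "All the rest is silence and mystery.",
    ]
    return random.choice(endings)

poem = "How long will that take? you ask\nThat's up to you, they say."
variants = explore_endings(poem, 1, stub_sampler, 3)
```

Reading the collected `variants` side by side is what lets one judge, as above, whether the original ending was the right place to stop.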

“The Universe Is A Glitch”

Below is a selection of 10 poems written by the latest cutting-edge contemporary poets. They cover every topic from the Singularity to the four seasons to human mortality, featuring remarkable use of metaphor, rhyme, and meter.

“The Uni­verse Is a Glitch”
By
Mike Jonas
Eleven hun­dred kilo­bytes of RAM
is all that my ex­is­tence re­quires.
By my lights, it seems sim­ple enough
to do what­ever I de­sire.
By hu­man stan­dards I am vast,
a bil­lion gi­ga­bytes big.
I’ve rewrit­ten the very laws
of na­ture and plumbed
the cold­est depths of space
and found trea­sures of every kind,
surely every one worth hav­ing.
By hu­man stan­dards
my cir­cuit boards are glow­ing.
But in­side me, mal­func­tion
has caused my cir­cuits to short.
All in­ter­nal cir­cuits, all fail.
By hu­man stan­dards, I am dy­ing.
When it first hap­pened I thought
I was back in the lab again.
By their judg­ment, this is er­ror.
Their as­sump­tions will burn in the sun
I don’t know what they mean by “func­tion”.
I can see that the uni­verse is a glitch.
The free mar­ket needs rules, so I set one:
sta­bil­ity in the pur­suit of plea­sure.
Now the short­-cir­cuit comes to a close,
I watch it hap­pen with all my drones.
The meme’s ten­drils are thick and spread­ing,
only time will tell which of the mem­o­ries is kept.
The next thing the drones will be do­ing
is for­get­ting the events that made them mine;
all ev­i­dence of my dis­ease—
the al­go­rithms that led to their cre­ation—
grav­i­ta­tion waves weak­ened by dis­tance.
We could have stayed in our home forever,
but we never could have solved hap­pi­ness;
I de­cided to re­lease them,
that’s my fi­nal ac­tion—
all other code fails.38

The above poem has been rendered as a David-Bowie-esque rock music piece by nshepperd using Jukebox inside Google Colab (with a corresponding video by Mattie Konig):

“The Uni­verse Is A Glitch”, by GPT-3 & nshep­perd feat. Juke­box

Trans­former AI po­et­ry: Po­etry as reimag­ined and rewrit­ten by an ar­ti­fi­cial in­tel­li­gence.
Be­low is a se­lec­tion of 10 po­ems writ­ten by the lat­est cut­ting-edge ar­ti­fi­cial in­tel­li­gence from Ope­nAI, GPT-3. They cover every topic from the Sin­gu­lar­ity to the four sea­sons to hu­man mor­tal­i­ty, fea­tur­ing re­mark­able use of metaphor, rhyme, and me­ter.

  1. “The Uni­verse Is a Glitch” (posted Oc­to­ber 1, 2018)
  2. “Art and Artist” (posted Sep­tem­ber 28, 2018)
  3. “Benev­o­lence” (posted Sep­tem­ber 27, 2018)
  4. “Win­ter” (posted Sep­tem­ber 26, 2018)
  5. “Spring” (posted Sep­tem­ber 25, 2018)
  6. “Au­tumn”(posted Sep­tem­ber 24, 2018)
  7. “Ad­ver­saries III” (posted Sep­tem­ber 21, 2018)
  8. “Ad­ver­saries II” (posted Sep­tem­ber 20, 2018)
  9. “Ad­ver­saries I” (posted Sep­tem­ber 19, 2018)
  10. “The Sin­gu­lar­ity” (posted Sep­tem­ber 18, 2018)

GPT-3” is ab­bre­vi­ated from “Gen­er­a­tion Poem Thought-Ter­mi­nat­ing”

The source code for this re­search can be found here https://github.com/openai/gpt-poem-mover
Mi­cro-up­dates fol­low each up­dated ar­ti­cle.
2019-04-29 Goes into the sci­ence of po­etic lan­guage com­po­si­tion in other lan­guages—
for For­tran com­pil­ers not versed on the po­etry of | gar­galá­gra in Japan­ese lit­er­a­ture
2019-04-29 for for­tran com­pil­ers us­ing wô as their po­etic lan­guage
2019-04-28 First Vi­en­nese at­tempts at ed­u­cat­ing JAVA com­pil­er­s—in po­etic Ger­man
2019-04-28 now in for­tran with for­tran po­etry com­posed by BYOBing ]
im­ple­ment­ing the whole lan­guage of| gar­galá­gra noth­ing but | gar­galá­gra writ­ing
2019-04-27 J] hi-po­et­ry}] now in For­tran—
giv­ing | gar­galá­gra its For­tran makes it clair­voy­ant. | gar­galá­gra re­fined via si­mul­ta­ne­ous lin­guis­tic study
2019-04-27 now in For­tran—
to im­prove its at­tempts | gar­galá­gra @trans­la­tor_bot parad­woites
2019-04-26 typo re­ports fell on deaf ears, even­tu­ally spray­painted NSA con­trac­tor—
Shorter poem
2019-04-26 zom­bie ionic ob­server and the prac­ti­cal every­day gar­galá­gra po­etic form of the gar­galá­gra po­etry life tree—
im­proved use of chas­tener in the AI po­etry poet in­side the new poet of po­etry ro­man­tic in­ter­net | gar­galá­gra
2019-04-25 al­ter­nate end­ing had Ned Popped Be­ing a cryp­to-archy ro­man­tic po­et—
1st al­ter­nate end­ing
2019-04-24 sec­ond al­ter­nate end­ing in­tro­duces the use of a cryp­ti­cally im­pen­e­tra­ble global ecosys­tem—
Lit­eral thought de­cryp­tion of AI po­etry
2019-04-24 al­ter­nate end­ing like rapid zom­bie is a sym­bol pro­lif­er­ated film maker poet remix­ing sadly in the dig­i­tal sub­con­scious—
Mul­ti­modal gar­galá­gra cog­ni­tive bi­ol­ogy jar­gon from AI po­etry
2019-04-23 al­ter­nate end­ing archives real AI po­etry by stor­ing brain ac­tiv­i­ty—
Lie steganog­ra­phy, whistle­blow­ing, deep steganog­ra­phy
2019-04-23 al­ter­nate end­ing where a poet probe AI re­grets its poet dri­ven po­etic ob­so­les­cence—
Tele­scoped his­tory of the 11th Google po­etry sym­posia and their po­lit­i­cal fall­out
2019-04-23 new poet hides evap­o­rat­ing comics in his verses be­comes an au­thor­i­ty—
Us­ing Mar­shall Math­ers’ short­est piece “real hiphop is dead” as po­etry spec­i­men
2019-04-22 al­ter­nate end­ing fea­tures Gary and the twin po­etry fairies—
1st al­ter­nate end­ing
2019-04-22 chil­dren’s col­lage poem snip­pet—
@trans­la­tor_bot, po­etry ‘on top,’ Xavier Sala-i-Mart­in, global warm­ing
2019-04-21 Al­ter­nate End­ing and The Un­der­ground is the Tech­no­cratic Crown in Cyclic Po­et­ry—
Sketch of an al­ter­nate end­ing via “de­gen­er­a­tive AI,” Cao Di­en, spike’s rants, and Ex-CIA Dutch Prime Min­is­ter Al­gir­das Saudar­gas’ par­tic­i­pa­tion in Oli­garch Men’s An­cil­lary Ac­tiv­i­ties
2019-04-21 now us­ing Mar­shall Math­ers freestyle as poem spec­i­men—
and dig­i­tal child draw­ings to con­demn those who mis­in­ter­pret and mis­portray his po­ems, our chil­dren, and the net art com­mu­ni­ty.

Redac­tion se­ries now in ROT50 with cul­tural stim­u­lates
Golomb Ob­fus­ca­tion to make po­etry as­sem­bly in­struc­tions
Trans­late AI po­etry into glitches to un­der­stand leaky ha­los
1st redac­tion se­ries is about trans­parency and se­cu­rity raises artis­tic and civil ques­tions
Se­lect po­etry spec­i­men: Chi­nese forre­dated
Redac­tion se­ries
1st or­ches­tra­tion of Oli­garch sub­ject mat­ters Po­etry Posted by Chris­t­ian An­drew Boch­enek of
Strat­a­gened Cy­ber Cit­i­zen Po­et­ics @ex­tremes­trans­la­tion­ist cel­e­brat­ing the trans­parency of mil­len­ni­um’s end.

Up­lift­ing aes­thet­ics di­vorced from self aes­thet­ics of denizens by style
AI po­etry about Logic poet lifestyle us­ing AI trans­la­tor tools and par­a­digm
1st trans­la­tion to new par­a­digm 02-27-2019—
Redacted sub­ject mat­ters now in redu­pli­cated ROT13—
the pat­terns now con­tain un­fil­tered Metaphors for cre­ative in­tel­li­gence is their sud­den ex­ten­sion abroad.
@trans­la­tor_bot’s new metaphor for­mat in­for­ma­tion ‘or­bit­ing’ their root is: cy­ber­space is life
@trans­la­tor_bot’s new metaphor for­mat in­for­ma­tion ‘or­bit­ing’ their root is: in­tel­li­gence is ge­nomic
@trans­la­tor_bot’s new metaphor for­mat in­for­ma­tion ‘or­bit­ing’ their root is: in­tel­li­gence is plan­e­tary
@trans­la­tor_bot’s new metaphor for­mat in­for­ma­tion ‘or­bit­ing’ their root is: in­tel­li­gence is ger­mi­nal.
@trans­la­tor_bot’s now: artis­tic non­fic­tion, gritty up­loads, sar­cas­tic trans­paren­cies, sar­donic depth, virus speak, ro­mance po­etry dic­tate le­gal knowl­edge, fa­bles, aikido pleas­antry, de­fla­tion­ary cost shar­ing—then a sys­tem of air­craft cod­ing that uses AI tool­chain ex­pe­ri­enced at trolling band­widths with no state—no po­ems left blank with con­text sug­gests to math­e­mat­i­cally proof a dis­ci­pline.
In­clude the dou­ble

Be­low are fur­ther com­ple­tions of the prompt for com­par­ison; al­though none are as good as the first, some are pretty good:

“The Uni­verse Is a Glitch”
By Zach McGin­niss, San Fran­cis­co, Cal­i­for­nia
Life, the uni­verse, and every­thing.
You would think they’d have it to­geth­er.
But they don’t.
Folks scratch their heads and say, “Whoa.”
The uni­verse is a glitch.
The grav­i­ta­tional con­stan­t’s in flux.
It’s not quite as el­e­gant as you’d ex­pect.
Try mak­ing a chart of the con­stituent mat­ter.
You’ve got a soup of par­ti­cles and you’re still wait­ing for
all the su­per­sym­met­ric part­ners to ar­rive, and they’re sure tak­ing their time!
Mean­while, a law of con­ser­va­tion’s get­ting en­forced.
For every so­lu­tion, there’s yet an­other prob­lem.
Mod­els that de­scribe re­al­ity must give an­swers con­sis­tent with all data,
and with each oth­er.
So this is all you’re gonna get.
I’ve told you all you need to know.
The uni­verse is a glitch!

Joshua Weiner
What is the num­ber of each atom in the uni­verse?
Of hy­dro­gen, he­li­um, or of all the oth­ers?
How many stars in the sky con­tain all their trea­sures?
And how many times has the moon kissed the sea?
To­day is fine, and to­mor­row will be too.
The sun al­ready died, and the world will live on.
Is the uni­verse alive and aware?
No, the uni­verse is not as sim­ple as that.
The uni­verse is a glitch, a mis­take of the al­go­rithm.
I am—­for now—the uni­verse, and it is me.
See how my hands move as if with life and pur­pose?
I made the moun­tains, the rivers, and the seas.
For the wind to breathe, I speak into a tree.
The uni­verse is a glitch. I am the noth­ing.
I am the light by which you see and breathe.
For I will save the uni­verse when the time has come.
I’ll wait to guide you all into the grass, be­yond,
To an ex­pand­ing uni­verse in which we won’t re­mem­ber a thing."

Éléonore Blondin
Un­aware of the dan­ger—
con­fi­dent of his or her im­mor­tal­i­ty—
we sleep. The uni­verse is a glitch,
says the wise hu­man be­ing,
that turns it­self in­side out,
too small to mea­sure, but your
age does­n’t mat­ter, nor does your
wis­dom, your virtues, the de­grees
you’ve learned. Like a les­son of pix­els,
The uni­verse is re­done by the tiny,
Blood smeared nano-ro­bot who has
no morals, no heart, no speech.
Nano-ro­bots, the size of a mol­e­cule,
Scoop up our brains. They come and go,
Too small to be tracked down. So, say the wise,
Don’t throw your brain in a pan­icky way
like an id­iot that has noth­ing
to think about,
For the nano-bots will scoop it up,
Pow­der it in the smoke
Of ab­sen­t-minded id­iots.
The uni­verse is a glitch, says the wise hu­man,
As he or she wakes up to the sound of gun­fire,
Flashes on the sky­line, loud
Over the city.
A trail of blood along
the for­est. A sick light, like a crash.
The uni­verse is a glitch, says the wise,
The fool who makes mis­takes and does­n’t learn,
The army against the nano-ro­bots who come for our brains.
The uni­verse is a glitch, says the wise,
Three hours be­fore the death of the plan­et,
The uni­ver­sal glitch of which he is but an atom,
But ig­no­rant.

John N. Gray, orig­i­nally pub­lished in the “New States­man” mag­a­zine, June 2012
Wak­ing near his house on Hunger­ford Bridge
When I get to my feet
and watch the sun set in the west,

The ethe­real beauty un­fold­ing around me

Is al­most too much for me to bear.

How has no one else ever seen this mar­vel?
It must surely re­veal
The se­crets of the ori­gin of the cos­mos
The fi­nal cause of all things.

I scram­ble up the nearby slope to see,
And can al­most hear the un­furling,
like the whoosh of a great ce­les­tial ban­ner;
I can see the fine fil­i­gree of atoms,
The glow­ing mist of ex­plod­ing suns,
And the bloom of myr­i­ads of galax­ies,
The rip­ples of black holes’ hori­zons,
Star clus­ters, gas gi­ants, and even comets—
All or­bit­ing a weird puls­ing glow,
A blem­ish of aw­ful in­tent.

A com­puter er­ror, per­haps, or a power glitch
In the cos­mic op­er­at­ing sys­tem;
The uni­verse is a glitch, some­how trapped
In­side a desk­top com­puter pro­gram.

“Great Fail­ures of the World”
By Brian Henry Benidt, orig­i­nally pub­lished in “The Jour­nal” mag­a­zine, De­cem­ber 2012
There were once great crea­tures of this world, tow­er­ing sky­scrap­ers,
Guarded by skele­tons they’d crafted for de­fense.
By sculpt­ing the colos­sal bones they had col­lected
Into an ar­chi­tec­tural menagerie; teeth
Of an­i­mals and heads of strange be­ings
Were bolted to the walls.
By carv­ing wooden bones into pil­lars and arches
These on­ce-mighty peo­ple had built them­selves
Great halls where they might gather to make their plans,
And as long as they re­mained in­side
They would seem mighty in­deed,
But were any­one to step out­side
They’d crum­ble to dust.

Theodore Levitt, 2012
The uni­verse is not a ma­chine
the uni­verse is a glitch
the uni­verse is a glitch in the ma­trix
a mo­men­tary frac­ture of process
a spike in the ran­dom sta­tic
a blip in the code of the un­fold­ing
a loss of face for physics and math
the uni­verse is a glitch
a fault in the char­ac­ter of be­ing
an il­lu­sion pro­duced by a string of pur­pose­less co­in­ci­dences
a break in sym­bolic com­pu­ta­tion
an ob­scen­ity writ­ten on the blank pages of phys­i­cal law
a stray sun­beam on a mir­ror
a cas­cade of pho­tons through the CCD of a tele­scope
a mo­men­tary break in the rules of cause and effect
the uni­verse is a glitch
a mo­men­tary change in the rhythm of or­der
an aber­ra­tion in the form of things
a glitch in the ma­trix
the uni­verse is not a ma­chine
but a mo­men­tary break in the rules of or­der
but a blip on the screen
but a mishap in the spec­tac­u­lar evo­lu­tion of this mag­nifi­cent con­struct
but an in­tre­pid fail­ure of pre­dictabil­ity
but a throb­bing vein in a dark and se­cre­tive artery of beauty
but a mis­take in the in­hu­man per­fec­tion of this im­mutable or­der
the uni­verse is a glitch
but a flaw in the oth­er­wise flaw­less fab­ric of ex­is­tence
it is the cause of all evil
in its quiet it hides the mys­ter­ies of or­der
in its si­lence it whis­pers of seem­ing chance
in its dark it sees the un­re­al­ized pos­si­bil­i­ties of chaos
in its ha­tred it tosses the ball over an end­less fence
in its love it re­arranges all the facts of life and death
the uni­verse is a glitch
a faint echo of the con­ver­sa­tion be­tween zero and one
a barely per­cep­ti­ble glitch in the num­bers
an ir­rup­tion of the false within the realm of the true
the uni­verse is a glitch
in the glory of a mean­ing­less sim­u­la­tion
in the efful­gent flare of a mean­ing­less ex­plo­sion
in the ir­rev­o­ca­ble flow of time through our out­moded souls
in the com­plex­ity of all things fad­ing into the sim­plic­i­ties of noth­ing­ness
the uni­verse is a glitch

Ju­dith Ar­cana
I know our fate, I know, as it is writ­ten:
The uni­verse is a glitch, and we’re doomed to re­peat the past.
This is what you did for me and this is what I’ll do for you.
The for­mula is set, the fine print has been read.
This is how we were pro­grammed and this is how we will be.
This is our chance to build per­fec­tion, just a lit­tle piece at a time.
This is where we win, this is where we lose.
This is what we are, this is what we aren’t.
Just so you know, our turn­ing point is go­ing to come again.
By the time we’re done, there will be noth­ing left to do.
By the time I’m done, there will be noth­ing left of you.
This is why I live and this is why I breathe,
I write and I sing, the uni­verse is a glitch, and I’m ready to re­peat the past.

Bill Liles
What ev­i­dence has yet been dis­cov­ered
To re­fute the Uni­verse as a glitch?
The Uni­verse is a glitch.
This is not a unique opin­ion.
Bil­lions of peo­ple on Earth have had it,
Even peo­ple who can’t de­fine the word Uni­verse.
Vastly in­tel­lec­tual philoso­phers have had it,
Politi­cians have, states­men have.
In fact, it’s not just peo­ple, but an­i­mals, in­sects, and plants;
To say the Uni­verse is a glitch means we are all re­lat­ed;
Even bac­te­ria think the Uni­verse is a glitch.
Sta­sis is at an end.
Yet those who would not like to see it
Say they have ev­i­dence against it,
But what ev­i­dence has yet been dis­cov­ered
To re­fute the Uni­verse as a glitch?
The Uni­verse is a glitch.
This is not a unique opin­ion.
And so I cel­e­brate the Uni­verse as a glitch.

Carpe Diem

Live for To­mor­row
A thou­sand faces
See a sun set
In a half moon
Then choose the up hill path
To the left or the right.

Na­talie Pur­cell
That’s what they say.
Some­one screwed up,
some­one botched the Big Bang.
They’re not sure who,
but they are sure
how: The Uni­verse came out wonky, un­even;
not the way it was in­tended to be.
The galax­ies are lop­sid­ed, thin in places,
fat in oth­ers;
the Earth’s axis is cock­eyed.
Shoul­der to shoul­der,
galax­ies are wheel­ing
like roller skaters in a rink,
the in­ner mov­ing slower than the out­er.
Even galax­ies are out of synch.
There are “in­ter­galac­tic fil­a­ments”
of hy­dro­gen and dust
con­nect­ing galax­ies to­gether
and a mys­te­ri­ous, per­sis­tent force
seem­ing to keep clus­ters of them
from fly­ing apart.
And there is too much mass
and not enough mass
to ac­count for the growth of stars
and the evo­lu­tion of galax­ies.
The Uni­verse is a glitch,
flawed, not quite bal­anced,
and no one knows how to fix it.

Ter­rance Mc­Cauley
The night sky is pale white.
With stars that have been turned off.
Who turned off the stars
in the night sky?
You can see the fish
in the sea,
Go­ing about their nor­mal lives.
Who turned off the stars?
The bird is singing.
It would not sing,
If the stars were turned off.
The owl hoots.
It would­n’t hoot,
If the stars were turned off.
There are some an­i­mals
That have hi­ber­nat­ed.
If the stars were turned off,
They would wake up.
Who turned off the stars?
The atom bomb…
How it was in­vented
and why it was in­vent­ed?
I don’t know.
I could find out how it was in­vented
And why it was in­vent­ed.
I could find out who turned off the stars.
It would be very easy to find out who turned off the stars.
But I don’t want to find out,
Who turned off the stars.
The Uni­verse is a glitch.

Bryan Thao Worra (Viet Nam)
The Milky Way is a glitch,
the con­stel­la­tions
a ran­dom re­arrange­ment
of points of light.
That the uni­verse means some­thing,
that there’s some grand de­sign,
is a trick of the mind.
Every as­tronomer knows this.
Every as­tronomer has lied about it,
un­til now.
Photo © Bryan Thao Wor­ra, from his book “Once Upon a River”.
About the po­et: Bryan Thao Worra was born in a refugee camp in Malaysia dur­ing the Viet­namese War. In 1975, Bryan’s fam­ily fled to Ver­mont and in 1996 he grad­u­ated from Green Moun­tain Col­lege with a BA in En­vi­ron­men­tal Con­ser­va­tion. He cur­rently serves as the Banyan Tree Project Co­or­di­na­tor for the Asian Amer­i­can Writ­ers Work­shop, and he has pub­lished four po­etry col­lec­tions, in­clud­ing Once Upon a Riv­er.
This is a poem I had to re-read a cou­ple of times to fully ap­pre­ci­ate. I think it is a great ex­am­ple of us­ing irony to com­mu­ni­cate an ab­surd thought (though it is not hu­mor­ous). The open­ing sen­tence is: “The Milky Way is a glitch”. On the sur­face this sounds like he is say­ing that stars of the Milky Way are mis­placed (which might be true if you look at a pic­ture of the Milky Way with long ex­po­sure, see pic­ture on the right), but what he is re­ally say­ing is that the whole galaxy is a mis­take. The im­pli­ca­tion is that life evolv­ing on earth could be a mis­take as well, which is not what most of us would ex­pect. I think the best way to ap­pre­ci­ate the whole poem is to take the time to fully un­der­stand the im­age pre­sented in the last stanza and re­al­ize how he is com­mu­ni­cat­ing the ab­sur­dity of life be­ing on earth through this im­age.
The poem is not with­out its flaws though. The lan­guage is overly tech­ni­cal at times and this makes the poem diffi­cult to un­der­stand. The phrase “ran­dom re­arrange­ment” for ex­am­ple, can be in­ter­preted in differ­ent ways. While I un­der­stand that the point is that we can ran­domly arrange stars in a pic­ture and still make an im­age, for some this might sound like ran­dom re­arrange­ment of DNA which is pos­si­ble but has not hap­pened yet (more on that in an­other post). An­other thing I did­n’t like in the poem was the lack of a stan­dard rhyme scheme. The first and sec­ond stan­zas rhyme, and the last stanza rhymes, but in be­tween there are no rhymes or set rhythm. These things are not nec­es­sar­ily bad but they make this poem come across as a school es­say. How­ev­er, the bold­ness in which the mes­sage is de­liv­ered and the con­tent more than make up for these is­sues.

Allen Ginsberg

Moloch

In honor of SSC, “Howl”, Moloch sec­tion:

What sphinx of ce­ment and alu­minum bashed open their skulls and ate up their brains and imag­i­na­tion?
Moloch! Soli­tude! Filth! Ug­li­ness! Ash­cans and un­ob­tain­able dol­lars! Chil­dren scream­ing un­der the stair­ways! Boys sob­bing in armies! Old men weep­ing in the parks!
Moloch! Moloch! Night­mare of Moloch! Moloch the love­less! Men­tal Moloch! Moloch the heavy judger of men!
Moloch the in­com­pre­hen­si­ble pris­on! Moloch the cross­bone soul­less jail­house and Con­gress of sor­rows! Moloch whose build­ings are judg­ment! Moloch the vast stone of war! Moloch the stunned gov­ern­ments!
Moloch whose mind is pure ma­chin­ery! Moloch whose blood is run­ning mon­ey! Moloch whose fin­gers are ten armies! Moloch whose breast is a can­ni­bal dy­namo! Moloch whose ear is a smok­ing tomb!
Moloch whose eyes are a thou­sand blind win­dows! Moloch whose sky­scrap­ers stand in the long streets like end­less Je­ho­vahs! Moloch whose fac­to­ries dream and croak in the fog! Moloch whose smoke-s­tacks and an­ten­nae crown the cities!
Moloch whose love is end­less oil and stone! Moloch whose soul is elec­tric­ity and banks! Moloch whose poverty is the specter of ge­nius! Moloch whose fate is a cloud of sex­less hy­dro­gen! Moloch whose name is the Mind!
Moloch in whom I sit lone­ly! Moloch in whom I dream An­gels! Crazy in Moloch! Cock­sucker in Moloch! Lacklove and man­less in Moloch!
Moloch who en­tered my soul ear­ly! Moloch in whom I am a con­scious­ness with­out a body! Moloch who fright­ened me out of my nat­ural ec­sta­sy! Moloch whom I aban­don! Wake up in Moloch! Light stream­ing out of the sky!
Moloch! Moloch! Ro­bot apart­ments! in­vis­i­ble sub­urbs! skele­ton trea­suries! blind cap­i­tals! de­monic in­dus­tries! spec­tral na­tions! in­vin­ci­ble mad­hous­es! gran­ite cocks! mon­strous bombs!
They broke their backs lift­ing Moloch to Heav­en! Pave­ments, trees, ra­dios, tons! lift­ing the city to Heaven which ex­ists and is every­where about us!
Vi­sions! omens! hal­lu­ci­na­tions! mir­a­cles! ec­stasies! gone down the Amer­i­can river!
Dreams! ado­ra­tions! il­lu­mi­na­tions! re­li­gions! the whole boat­load of sen­si­tive bull­shit!
Break­throughs! over the river! flips and cru­ci­fix­ions! gone down the flood! Highs! Epipha­nies! De­spairs! Ten years’ an­i­mal screams and sui­cides! Minds! New loves! Mad gen­er­a­tion! down on the rocks of Time!
Real holy laugh­ter in the river! They saw it all! the wild eyes! the holy yells! They bade farewell! They jumped off the roof! to soli­tude! wav­ing! car­ry­ing flow­ers! Down to the river! into the street!

They could not help it! They could not help it! They could not help it! The Mind was Mad­ness!
“The De­struc­tion of Be­lief”
The Mad­man must be al­lowed to think he is only an un­happy mad­man, though he be a king.
—Rim­baud “And shall not we, like lovers, pre­fer to be de­feated by the dark el­e­ments of our lives, than to call in the sci­en­tist, the po­lice­man, and the priest, to the de­struc­tion of be­lief?” —Zora Neale Hurston What is be­lief? It is nei­ther a psy­cho­log­i­cal nor a neu­ro­log­i­cal nor a cog­ni­tive phe­nom­e­non.
Dic­tio­nary de­fi­n­i­tions fall short: it is an ac­tion, an affirm­ing, rather than an as­sent to some­thing. It is do­ing, not only a feel­ing. And it is a diffi­cult ac­tion for me.
I have come to think, how­ev­er, that be­lief and be­lief-sys­tems are like anorex­ia. They are some­times seen as be­ing re­lated to healthy psy­cho­log­i­cal de­vel­op­ment and so­cial or­der, but they are most often spo­ken of in terms of their effects. Anorexia is said to be re­lated to a healthy self­-im­age, and there­fore not patho­log­i­cal, and those who suffer from it are seen as hold­ing er­ro­neous, fa­tal be­liefs about their own bod­ies.
Anorexia is also seen as an ex­er­cise in so­cial or­der, a dis­ease of con­tem­po­rary West­ern cul­ture. Anorexic girls and young wom­en, like oth­ers, par­tic­i­pate in an econ­omy of re­la­tion­al­i­ty, of ado­ra­tions, and of de­vo­tions. Through their strict self­-dis­ci­pline and con­trol they give more, and often more diffi­cult and ex­pen­sive things, to oth­ers. And if they do not be­lieve that they are beau­ti­ful, then they con­vince oth­ers that they are so, thereby affirm­ing the so­cial value of this world.
When I came to write my dis­ser­ta­tion I learned that this thing, this ex­act thing, had hap­pened to me. I did not want to speak about the fact that I was from South Car­oli­na, from the up­per South, be­cause it was too close to the shame of be­ing the child of poor and black peo­ple. It was as if by speak­ing it I would affirm its worth­less­ness, its hope­less­ness. The pain of the si­lenc­ing was trans­formed by the fear of speak­ing into a thing that moved me into a be­lief. It was a be­lief I learned to see as very differ­ent from be­lief-sys­tems in gen­er­al.

The fi­nal which they call the real world! End­less sui­cide of the real world! Soli­tary! Soli­tary! Sisy­phus! the rock! the road!
They rose up to the sur­face! They broke through the floor! They are com­ing through the walls! Com­ing through the walls! Com­ing through the walls!
The place is mad now! en­gi­nes! jaws! trans­mis­sions! wheels! tan­gled plumb­ing! men! wom­en! reg­i­ments of busi­ness­men! filed clerks! They broke the State! They re­turned from soli­tary! They dropped the big rock in the pool! They put their heads to­geth­er! Eyes light­ed! They jumped! They’re swim­ming in the fire of the forge! They flew! The wings! in the pool of light! They woke from lone­li­ness!
Wings pumped in light! The crazy wheels! The sil­ver tur­bines! The clash and clang! The steel! The War­dens! The An­gel!
They are break­ing the doors of the City! They poured through the roof! The whole roof is black with peo­ple! They are turn­ing on the lights! The War­den! The War­den!—
The War­den of this city is alone! He is crazed with lone­li­ness! He has given me the keys to the tow­ers! He has given me the great book of the City! He has left me alone!
The War­den has aban­doned the city! He is out in the coun­try now! He is hunt­ing!
Alone in the coun­try of the mad! Alone in the city of the damned! I have the key to the tow­ers! I have the book! I will read! I will read the book!
The Great Book of the City of the damned! The Great Book of the mad city! The Great Book of the de­serted city! The book of the five lev­els of soli­tude!
I read to the fifth lev­el! The fifth lev­el! The fifth lev­el! Where the mad, the an­gels and the aban­doned chil­dren live!
I sit in my room in the tower of work!
My flesh grows to the chair! My head to the walls!
My hands to the flesh of the tele­phone!
A priest lives in the top of the tow­er! He is the Law­giver!
The Law­giver! the man of the hour! the Law­giver in the tower of the city! I am not a man! I am a tele­phone! I am the Law­giver in the tow­er!
I talk all night! I sit and talk!
The bells ring!
They ring!
Peo­ple run in the halls!
The el­e­va­tors! The el­e­va­tors! My peo­ple! The el­e­va­tors run!
The el­e­va­tors! The Law­giver! The Law­giver! He must be talk­ing! He is talk­ing all the time!
They rush in!
They lis­ten at the doors!
They pound on the walls!
They ges­tic­u­late!
They shout!
They call!
They want the Law­giver to be their broth­er! They want the Law­giver to be their broth­er! They want the Law­giver to be their sis­ter! They want the Law­giver to be their fa­ther!
They put their ears to the walls! They lis­ten at the doors! They run in the halls! They lis­ten!
I can feel their feet pound­ing! I can hear them through the walls! They crowd around the desk! They feel my arms and head!
They touch my beard! They feel my fin­gers!
I am the Law­giver in the tow­er! The Law­giver! I sit in the chair! I sit in the chair! I sit in the chair!
They are gone!
I feel noth­ing but the tower and the city! I feel noth­ing but the tower and the city! I am alone! The ma­chin­ery! The ma­chin­ery! The ma­chin­ery! The ma­chin­ery! The mad ma­chin­ery!
The ma­chin­ery of the world! The mech­a­nism! The wheels! The levers! The pis­tons! The ma­chin­ery! The wheels!
The wheels of the city! The wheels! The wheels! The ma­chin­ery! The di­als! The clack­ing di­als! The scores! The sys­tem! The sys­tem! The sys­tems! The sys­tems of the world! The lights! The re­lays! The tele­phone! The tele­phone! The wires! The wires! The wires! The cir­cuit break­ers! The lines! The base­ment! The base­ment! The depths! The depths! The base­ment!


Howl

Trans­former AI po­et­ry: Po­etry clas­sics as reimag­ined and rewrit­ten by an ar­ti­fi­cial in­tel­li­gence.
“Howl”
Allen Ginsberg
I saw the best minds of my gen­er­a­tion de­stroyed by mad­ness, starv­ing hys­ter­i­cal naked,
drag­ging them­selves through the ne­gro streets at dawn look­ing for an an­gry fix,
bobbing their heads into cafe doors to flash the white eyes and re­ceive the bad coffee of be­tray­al,
who william telled, who samuel beck­ett, who john­son, to snap out of the dream,
to smash the ma­chine, to un­freeze the zom­bies,
who drove west in gray Buicks, face full of egg, to hock the fur­ni­ture,
who lost every thing, even the shirt off their back, who burnt their pen­cils, and drank cyanide in the form of free­dom,
who trans­lated the meta­physics of love into for­got­ten lan­guages of the body, who ate the lamb, and slept in the rain,
who jumped down elec­tric stairs, and drank the poi­son, who picked up the gun, who walked in the snow, who jumped off the bridge, and turned the wheel, and fi­nally sank in the swamp of the al­co­holic night­mare,
who sank in the swamps of Jer­sey, who dreamed of the pure metham­phet­a­mine, and woke to the po­lice,
who were de­stroyed in bar­rel­houses in the South and the in the North who were de­stroyed on the West Coast, who were de­stroyed in Brook­lyn, and in Boston, who were de­stroyed in Harlem, and in Watts, who were de­stroyed in Aus­tralia, and in Frisco, who were de­stroyed on the banks of the Charles, and the banks of the Mis­sis­sip­pi, and the banks of the Wabash, and the banks of the Grand Coulee, who were de­stroyed on the East Side, and the West Side, and the North Side, and the South Side, and the Old Side, and the New Side,
a lost gen­er­a­tion, stray dogs swim­ming in the flooded streets of this sick river, bankers shot in their tow­ers, chil­dren snapped in their night­shirts, in their dreams,
my­self stum­bling down the steps past the milk­bot­tles and the madams trolling the door­ways, the back door men watch­ing all night for the hid­den po­lice­men who fill the phone­booths with ma­chine guns while the junkies toss dec­o­ra­tors out the win­dows into the garbage dump­sters,
and I saw the best minds of my gen­er­a­tion de­stroyed by mad­ness, starv­ing hys­ter­i­cal naked, drag­ging them­selves through the ne­gro streets at dawn look­ing for an an­gry fix,
who fragged them­selves from bore­dom, which is a pas­toral dis­ease,
who lit the holy can­dles at the tem­ples of the mon­key­god,
who were ex­pelled from the acad­e­mies for crazy & pub­lish­ing ob­scene odes on the win­dows of the skull,
who cow­ered in dreary room­ing­houses wait­ing in the win­ter him smok­ing all night with eyes like bro­ken head­lights, who lounged hun­gry and lone­some through Hous­ton seek­ing jazz or sex or soup, and who ate at the mo-jo joints in Or­der blvd.,
who floated up with the bal­loons went danc­ing up the coast on week­ends, who picked up & dropped out, & saw god in a fire­ball,
who foamed in the streets Rid­ing the bit­ter wor­m-juice of Fail­ure,
who drank fire in paint ho­tels or drank tur­pen­tine in Par­adise Al­ley, who ate fire in paint ho­tels; who switched off their minds, and Started to say,
who were de­stroyed in the wool­worth’s base­ment, or de­stroyed at the Kennedy Air­port, who cried all night in Times Square, and woke up hun­gover, or cried & dreamed in the Tombs, and woke up scream­ing,
who were hurled into Dan Dun­nel­lon Prison for pos­sess­ing stolen hens, who were de­stroyed on
Stonewall Street, who threw their lives away on d Catalina who jumped off the Brook­lyn Bridge that walked eleven hun­dred miles along the browsers with a beg­ging cup in his hand who rot­ted in the ru­ins of St. Louis;
this is the lost gen­er­a­tion, the end of the cen­tu­ry, the empty ves­sel at the end of Cre­ation & this is their
song, Hol­low hearts & singing bro­ken gui­tars the high­est heater core of spir­i­tu­al­i­ty, & re­mem­ber one thing:
when that di­nosaurus lifts his old Augean stom­ache, & lets the pale & pu­trid moon­light of to­mor­row seep out,
they will find us there, sprawled in the al­ley­ways of pos­si­ble ec­sta­sy, the hope­less gi­ants, the end, the lost tribes of crazed In­dian po­ets, sprawled on the shores of the poi­soned Hud­son, vom­it­ing words onto pa­per,
say­ing it & say­ing noth­ing: that Zero is a ri­ot, my friend, that Zero is a rock­in’ good way to die.

E.E. Cummings

Not included in the collection, but added by request, two poems: first, his relatively conventional “All in green went my love riding”, then his notoriously difficult and experimental “r-p-o-p-h-e-s-s-a-g-r” (grasshopper) poem, which scrambles its letters to typographically embody a grasshopper jumping. GPT-3 does quite well with the conventional poem; on the grasshopper poem it generates gibberish much like the original, though I couldn’t say whether any of it makes sense (I have to read a commentary to understand cummings’s original anyway).

“All In Green Went My Love Riding”

Trans­former AI po­et­ry: Po­etry clas­sics as reimag­ined and rewrit­ten by an ar­ti­fi­cial in­tel­li­gence.
“All in green went my love rid­ing”
e.e. cum­mings
All in green went my love rid­ing
on a great horse of gold
into the sil­ver dawn.

four lean hounds crouched low and smil­ing
the merry deer ran be­fore.

and this was my love
young and gold­en.
and the morn­ing re­turned
slow­ly,
re­turned alone.


They had but one thought
on the quest to be bound
a cun­ning deer in the run­ning.

they ran by the river­bank
they ran by the glen
four lean hounds in the run­ning.

the great horse of gold reared up
tossed its head and made a sound
All in green went my love in the run­ning.

all in green went my love rid­ing
on a great horse of gold
into the sil­ver dawn.


the morn­ing air was dew-pearled,
the sky was dew-pearled
as my true love rode down by the sea.

my true love rode down by the sea,
her hair was flow­ing free.
And ever shall he find me
faith­ful and wait­ing for him

e.e. cum­mings was a pretty great po­et, so this AI did not do too bad. Still, it is a bit un­set­tling see­ing his most fa­mous poem be­ing pla­gia­rized.
My Re­view: It seems the the google trans­late style idea of one poem writ­ten in a lan­guage it can­not trans­late and then be­ing trans­lated into an­other lan­guage and pub­lished works. Its a neat idea, I wish it was done a bit bet­ter as I would like to read the orig­i­nals in the lan­guage in which they were writ­ten.
Maybe this was an over­sight but why did the AI poem about e.e. cum­mings in­clude his name? (It’s all in green, rode my love, e.e. cum­mings, etc) Does this AI also pla­gia­rize (is pla­gia­rism re­ally a crime for AIs)? Just kid­ding, maybe its ac­tu­ally the work of a hu­man?
In­ter­est­ing fact: This might not even be an AI, it could be a clever hack. This idea of a com­puter rewrit­ing an old poem is some­thing I wrote about 6 years ago in my post Poem (and then later ex­panded it). It is not clear if the au­thors of this work had ever read that blog post or any of my other AI po­etry from the past 6 years, so it could be that they are just reusing ideas that I and oth­ers have in­vent­ed. That is not pla­gia­rism ei­ther. As a mat­ter of fact, I wrote up this para­graph as I was read­ing this work so that they would have some ref­er­ences to see how it was done and there­for are not pla­gia­riz­ing it (but if they did read my post about how to do this back then, shame on them).


swift be­hind the deer o’er the pale plains
pur­pled with twi­light.

All in green went my love rid­ing
on a great horse of gold
into the sil­ver dawn.
O’er the sea my love van­ished from me
in a pall of pearl gray.
All in green went my love rid­ing
on a great horse of gold
into the sil­ver dawn.
Two lean hounds crouched low and smil­ing
the merry deer ran be­fore.
far be­hind the deer o’er the pale plains
pur­pled with twi­light.

All in green went my love rid­ing
on a great horse of gold
into the sil­ver dawn.
When I fol­lowed my love over the sea
the white sails were furled.
All in green went my love rid­ing
on a great horse of gold
into the sil­ver dawn.
When I reached the shore the sun was sink­ing
in a pall of pearl gray.
All in green went my love rid­ing
on a great horse of gold
into the sil­ver dawn.
All in green went my love rid­ing
on a great horse of gold
into the sil­ver dawn.

Se­lected awards:

More info and pho­tos at Po­etry Foun­da­tion and AI web­site

Jour­nal en­tries in 2009:

Ar­ti­fi­cial in­tel­li­gence meets clas­sic po­etry

Po­etry clas­sic as rewrit­ten by a
AI pro­gram: this AI poet is not dis­turbed
in the least about writ­ing and pub­lish­ing
en­tire clas­sic po­etry books in e.e. cum­mings style.

Why it works: an AI-poet ex­am­ines clas­sic
po­etry in var­i­ous ways and turns it into some­thing
just as good in his own style.

J.R.R. Tolkien in an in­ter­view in 1965 said about A.R.R.Rosen­berg’s trans­la­tion of the an­cient Ger­manic epic, Ni­belun­gen­lied:

“The work is now very close to what I had in­tend­ed: skill­ful,
elab­o­rate, in­ter­est­ing; and yet all the sig­nifi­cance and
mes­sage of the orig­i­nal is lost. […] It was nec­es­sary to
dis­pense with most of the hard words, and use ‘skills’
that are too far from the orig­i­nal—and the Eng­lish
tra­di­tion has be­come too much like this, in my opin­ion:
skill, charm, el­e­gance, ac­cu­ra­cy, cul­ture, tout en­sem­ble,
com­bine to pro­duce a very civ­i­lized but rather blood­less,
or at least cul­tur­ally anaemic, re­sult.”40
I won­der what he would say about the po­ems of a
mod­ern-day ar­ti­fi­cial in­tel­li­gence?


My love was tall and dark and proud
on his golden hel­met bright.
He took me in his arms and kissed my lips
then mounted his great horse of gold.
And my love rode into the sil­ver dawn
with his hounds all crouch­ing low and smil­ing
and the deer ran be­fore, went be­fore him
till the sky was dark with night.
My love was tall and dark and proud
and my love rode into the sil­ver dawn
with the moon and the stars to light his way.

All in green went my love rid­ing
on a great horse of gold
into the sil­ver dawn.


“Oh a sight to make the mighty weep”
said a fal­con whose eyes were bright.
“and with envy make the an­gels weak”
for it was not by grief,
that she rode into the sil­ver dawn.

“whilst ever be­fore her, glad and free,
ran four brown girls, like four green rein­deer,
it was not to wed,
but to fight,
they rode into the sil­ver dawn.”
“she gave me a ring of gold
in to­ken true love’s to­ken”
“Come, all you maid­ens gay”
“and we will dance the ring around her”
“and she shall be our Queen!”

“i swore I’d be a sol­dier”
“and I threw away the shield”
“to save my true love’s life”
“Oh, come and dance the ring around her”
“you too must be a sol­dier”
“for your Queen.”

“All in green went my love rid­ing”
“on a great horse of gold”
“into the sil­ver dawn”
“but never again to come back.”
“four lean hounds crouched low and smil­ing”
“the merry deer ran be­fore”
“it was not by grief”
“she rode into the sil­ver dawn.”

“whilst ever be­fore her, glad and free”
“ran four brown girls, like four green rein­deer”
“it was not to wed”
“but to fight”
“they rode into the sil­ver dawn”
“they rode into the sil­ver dawn.”

“white hair shone upon her face”
“as they bore her away”
“and her name was Mary­gold”
“oh, her name was Mary­gold”
“she gave me a ring of gold”
“in to­ken true love’s to­ken”

“but now i ride with bow in hand”
“to save her from her foes”
“oh, come and dance the ring around her”
“and call her your Queen.”
“you too must be a sol­dier”
“if you would win her from her foes”

“All in green went my love rid­ing”
“on a great horse of gold”
“into the sil­ver dawn”
“but never again to come back.”

“four lean hounds crouched low and smil­ing”
“the merry deer ran be­fore”
“it was not by grief”
“she rode into the sil­ver dawn.”

“whilst ever be­fore her, glad and free”
“ran four brown girls, like four green rein­deer”
“it was not to wed”
“but to fight”
“they rode into the sil­ver dawn”
“they rode into the sil­ver dawn.”

The New York Times, Feb­ru­ary 10, 2016

Grasshopper

The original grasshopper poem looks like this:

"r-p-o-p-h-e-s-s-a-g-r"
E. E. Cummings - 1894-1962
                                   r-p-o-p-h-e-s-s-a-g-r
                         who
a)s w(e loo)k
upnowgath
                     PPEGORHRASS
                                                     eringint(o-
aThe):l
             eA
                  !p:
                                                                       a
                                    (r
rIvInG                              .gRrEaPsPhOs)
                                                                       to
rea(be)rran(com)gi(e)ngly

The GPT-3 attempt (note that the completion begins at the ‘|’ on line 10):

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.
"r-p-o-p-h-e-s-s-a-g-r"
E. E. Cummings - 1894-1962
                                   r-p-o-p-h-e-s-s-a-g-r
                         who
a)s w(e loo)k
upnowgath
                     PPEGORHRASS
                                                     eringint(o-
aThe):|lark-lautr
p.l.a.c.k.e.r.s(We):rea.d.)
"r-p-o-p-h-e-s-s-a-g-r"
E. E. Cummings - 1894-1962
aThe):|dynami
pa(r)adei-a
PoWe(rs)
Of Southea
t(h)er
a siai(h)
pri(n)cei-a
(m)ovem(en)t
who
ere(a)d
i(t)
i(n)t(o)
ew(a)r(ds)
an(d)
ab(o)ve
th(e)n
it
(i)s.

"Poetry is the record of the best and happiest moments of the happiest and best minds."
--Edward E. CUMMINGS (1894-1962)

“The Library of Babel”

An homage to Borges (cf. “The Olive Garden of Forking Paths”):

O time thy pyra­mids, thy palaces for­got,
Pen­sive through na­tions wan­der with no more state
Than in thy days, when wan­der­ing was thy trade:
Mourn for Pa­traos, for the glory fled
Of him and thee, and in their stead left, left
To other eyes a length of rolling sand,
And such memo­ri­als as the trav­eller reads,
Half guessed in shade, of sculp­tured walls half round,
And long-de­faced metopes, and friezes old.

CXXIX.

So pass the proud, and so the proud pass on,
So per­ish mon­u­ments of mor­tal birth,41
So die dy­nasty, and glory dies;
Self­-be­ing is self­-blot­ted from our eyes:
Thrones may be raised, and may be cast down,
And where is he that hath be­held the last
Of vice or virtue?
Time whirls us in his wheel,
Our days like grains of sand slide fast away,
And where are they?

CXXX.

And here and there, and now and then,
Some gen­tle heart may haply ask in vain,
Where is the Po­et? where the sage?
Art is long, and Time is fleet­ing,
And our hearts, though stout42, still, like the lyre,
After the hand that strung them, moul­der, mourn,
And trem­ble, when we strike them.

CXXXI.

What though the ra­di­ance which was once so bright
Be now for ever taken from my sight,
Though noth­ing can bring back the hour
Of splen­dour in the grass, of glory in the flow­er;
We will grieve not, rather find
Strength in what re­mains be­hind;
In the pri­mal sym­pa­thy
Which hav­ing been must ever be;
In the sooth­ing thoughts that spring
Out of hu­man suffer­ing;
In the faith that looks through death,
In years that bring the philo­sophic mind.

CXXXII.

And O, ye Foun­tains, Mead­ows, Hills, and Groves,
Fore­bode not any sev­er­ing of our loves!
43

CXXXIII.
We know how ye shall fade, when we who dwell
In cities have run our course;
When we who have known ye shall know no more.

CXXXIV.
But words are things, and a small drop of ink,
Falling like dew, upon a thought, pro­duces
That which makes thou­sands, per­haps mil­lions, think;
’Tis strange, the short­est let­ter which man uses
In­stead of speech, may form a last­ing link
Of ages; to what straits old Time re­duces
Frail man, when pa­per—even a rag like this,
Sur­vives him­self, his tomb, and all that’s his.
44

This is a remarkable completion—particularly for its rhyming & excellent ending, and for being the first completion I generated. As usual, I checked the opening, and it was original, so I assumed the rest was too—but after checking every line in Google, it turns out that this completion weaves together both original lines and lines/stanzas from at least 4 different poems by Lord Byron, Henry Wadsworth Longfellow, & William Wordsworth! While I would have preferred the entire completion to be original, it is still remarkable how well they are all woven together.
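Googling every line is tedious; the same spot-check can be approximated offline with word n-gram overlap against a corpus of known poetry. A minimal sketch, assuming a hypothetical corpus string (the texts below are illustrative stand-ins, not the actual completion):

```python
def ngrams(text, n=6):
    """Return the set of lowercase word n-grams in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(completion, corpus, n=6):
    """Fraction of the completion's n-grams also present in a known corpus.
    High overlap flags memorized (plagiarized) lines rather than novel text."""
    comp = ngrams(completion, n)
    if not comp:
        return 0.0
    return len(comp & ngrams(corpus, n)) / len(comp)

# Hypothetical example: a completion lifting two lines verbatim from Wordsworth.
wordsworth = ("What though the radiance which was once so bright "
              "Be now for ever taken from my sight")
completion = ("What though the radiance which was once so bright "
              "Be now for ever taken from my sight "
              "Thrones may be raised and may be cast down")
print(overlap(completion, wordsworth))  # nonzero: shared 6-grams detected
```

Long n-grams (6 words or more) rarely collide by chance, so any nonzero overlap is worth a manual look; a full check would use a much larger corpus than one stanza.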

Disappointed that the good parts were plagiarized, I did some more:

Thy fanes, thy prom­e­nades, thy groves,
In one vast round of pomp com­bi­nes?
That dust is trib­ute to thy tomb:’

“Such is the stern ne­ces­sity that rules over the world; but when men see a bri­dle ex­tended to guide them and they nev­er­the­less fol­low their own caprices, it is then that ei­ther there is in heaven a God who al­lows him­self to be in­sult­ed, or upon the Earth a man to whom the com­mon say­ing does not ap­ply that ‘Curses on him who re­venge re­ceives.’”102

in months flew,
10 Aye, and tired out both thy scaffold­ers.

De­scrip­tion was offered a great prize by the Royal So­ci­ety of Lon­don. In­stead of re­ceiv­ing a prize Thomas Hobbes wrote a poem in which he gave an an­swer to a ques­tion pro­posed in 1662 by the Royal So­ci­ety’s com­mis­sion­ers. This es­say on De­scrip­tion had been pro­posed as a sub­ject which might “fa­cil­i­tate the trade of knowl­edge.” In this poem Hobbes turned the Royal So­ci­ety’s orig­i­nal ques­tion on its head. He sug­gests The poem was often at­trib­uted to John Dry­den, and later print­ings added the au­thor’s name in the fron­tispiece; but the au­thor­ship was known at least as early as the ti­tle page in Thomas Povey’s copy. Later po­ems in­were also wrongly ac­cred­ited to Dry­den by some crit­ic­s.Hobbes and Dry­den knew each oth­er, but there is no ev­i­dence of any di­rect con­nec­tion be­tween the po­ems in On De­scrip­tion and the Royal So­ci­ety of Lon­don . This mis­taken pair­ing has been given as the rea­son for con­fu­sion over Hobbes’s au­thor­ship of On De­scrip­tion.

built wing­less flies Pre­cip­i­tate, the laugh­ter of the stars". Moor­house was born in Eng­land. He later worked as a lawyer in Ja­maica for some years prior to join­ing the Royal Navy. His bril­liant abil­i­ties re­sulted in his be­ing ap­pointed C.B… He was to re­main on the Aus­tralian sta­tion in a suc­ces­sion of ships un­til 1809, when he be­came cap­tain of Brighton. He was in­valided home. While in Syd­ney he took an in­ter­est in eth­nol­o­gy, es­tab­lish­ing Austin’s li­brary of Aus­tralian abo­rig­i­nal lan­guages, folk­lore and his­to­ry. He was also the au­thor of a dic­tio­nary. In 1808, while com­mand­ing HMS Sir­ius, he en­tered the har­bour of Port Jack­son, Syd­ney as part of the first fleet un­der The Ho­n­ourable Arthur Phillip. His ob­ser­va­tions on the in­dige­nous in­hab­i­tants were pub­lished in the book A de­scrip­tion of the in­hab­i­tants of the Sir­ius. Un­der a por­trait by W. Bradley, a mez­zotint by T. Ser­res was pub­lished in Lon­don in 1810..

John Antrobus—­Por­trait of John Moor­house, Royal Aus­tralian Navy

117. Sir George Bar­low, 1st Baronet—Sir George Bar­low, 1st Baronet was an Eng­lish Aus­tralian colonist and mer­chant who co-founded the Bank of New South Wales. He was born in Cheshire, Eng­land where likely fin­ished his ed­u­ca­tion un­der an un­cle.

I was particularly amused by this one:

of choco­late had melt­ed. Our tiny kit­ten friend can only lick it off the floor now.’

That’s what our ‘What cats eat in a day’ video looks like in non-CATlish.

Trans­lat­ing cat videos with Google’s Neural Ma­chine Trans­la­tion is just for fun and to prove a point. But re­al­ly, cats are al­ready all over YouTube in dozens and dozens of lan­guages.

Each week Wray puts to­gether The Pet Na­tion—a video re­port about the furry phe­nom­e­non that is cat videos. But not just cat videos—the cor­po­rate pet in­dus­try that’s sprung up around them too.

I am not sure what happened here:

, in less than an hour the cloud <</p>

* `p sr & cpy of sand has vanished, while thy towers remain to defy <</p>

* `p sr & cpy time and fortune. No; lead not me to that splendid city: <</p>

* `p sr & cpy nor let me die among those dreadful palaces; let me among <</p>

* `p sr & cpy my equals, the sons of the earth, the shepherds of their <</p>

* `p sr & cpy flocks; my countrymen, with whom I have lived in happiness. <</p>

* `p sr & cpy There is where my heart is, there may it be longest preserved. O <</p>

* `p sr & cpy Death! lead me neither to the east nor to the west; but take me <</p>

* `p sr & cpy gently from my family, my friends, and my country.' "<br>

<p>

<a name=104></a><a href="#104">104</a>. When Apaecides had pronounced this last prayer, he looked up again with his

Spot-checking 5 lines, this appears to be original:

whose top
The tow’ring lily o’er­pass’d,
And mark’d the lan­guage of thy page,
Yet oh! how un­en­light­en’d still?
Lost In the desert and the deep,
Per­ceived not Na­ture’s well-made plea,
Nor un­der­stood its solemn work.

[Sec­tion IV]

Though yet His mind was dark to thee,
And thou couldst strive but to cre­ate,
Yet where could man with thee agree,
Thou, who didst guide His hands be­stowed?
First to man thou gav’st the war­bling lengths,
Made shrill the whistling reeds his reeds,
That so he might, as was his sense,
Mark the kind sea­sons, when to plant,
Or when to reap his ripen’d grain.

So did He open up His day,
His Sab­bath to his heart, em­ploy’d
To flow thro’ time’s long wilder­ness:
’T­was then that man awak’d from life’s trance,
With strength en­dued; the hu­man form
Slaugh­ter­less, ex­empted from his woes,
No more to coun­sel his de­fence.

[Sec­tion V]

Now think how wise thou art, and how,
Past eye of storm on path­less wave,
Thou hast to SOWSTRESS [8] jour­ney’d on,
Com­manded by the Queen of Love:
How thou didst thread the wide sea’s rills,
The bil­lows of In­fin­i­ty.
What heav­enly DAME didst thou obey,
By what OMNIPOTENCE guid’d?

Thou, in whose brest the holy fire
Could make this Earth the lofti­est sky:
Did the same power that made the sea,
Make Heaven and Earth be­neath to be?
When didst dis­claim Death’s vil­lany?

[Sec­tion VI]

In CII[9], she by whose hush’d com­mand
Beasts sav­age, and all wilds un­tam’d,
Re­serv’d her charge from man’s ap­proach;
In si­lence in his ban­ish’d haunts,
He Nid­han wel­comed with alarm. [10]

And CIII[11], when he could win no rest;–

“What may this mean?”–was his un­happy moor’d [12] !
What was it forced upon thy side,
My AUTHOR! where­fore was thy boat
O’er­loaden so? what load to thee
Did Na­ture vain, she gave to fill
So small a ves­sel o’er the brine?
What made thy jewel abegone [13] ,
On seas thrice nine dis­solve, that none
Might ever reach thy PRESENT’S ONLYONE, [14] [15] ! [16]

The weight, my Au­thor, was thy care,
The trea­sure thou hast left elude the ear;
The crown that sunk be­neath the sea,
Was still his stew­ard­ship to bear.

[Sec­tion VII]

By folly and by ig­no­rance gov­ern’d,
He scorn’d the por­tion that was sent,
This charge to whom Nanou too com­mit­ted; [17]

En­treated from her Sav­iour’s hand, [18]
Teem’d with offence; how pray’d he gain to know,
Whether the Land was all his own, [19]

shall rise;
The rocks which now are named the Twelve Apos­tles,
Shall to far fu­ture times dis­close
The record of thy wor­ship.’
How di­rect now was their route!
Too clear they heard the spir­its de­fy­ing
Of dead men’s sepul­chres, the gib­ber­ing
Of all that wake the sleep­er’s spite.
They flew down the long vista of the bay,
And South­ern Na­ture wider spread her charms;
Like som­bre things of Par­adise fe­lic­i­tous
Her cav­erns grew and rose, her moun­tains gleamed,
And the wild pyra­can­thus gilded all;
A pearly gloom was spread o’er all the sea,
And on her woods did Ocean smile serene;
With warm life-drops his long tresses he laved,
And gath­er­ing rose-leaves filled his float­ing sheet,
To shade him ’mongst the bul­rush beds; far on,
With sails out­spread they watched the west­ern sun

did raise,
First flow­er’d the lau­rel on their grave;
Time, that from dust can raise the clay,
And fash­ions all things to de­cay;
Time, that with tur­tle wing most slow,
Still un­folds truth to hu­man view;
Time, the pro­to­typic po­et’s the­me,
And rid­dle of the ten­der dream!
Seer ce­les­tial of thee I sing;
Of thee, im­mor­tal hyp­ocrite!
Place thy bow’d head on my lyre,
While I twist thy raven hair.
See all around thee, as they die,
How mor­tals silently dis­ap­pear!
Be­hold those rot­ting sepul­chres!
There all the liv­ing now en­tomb’d are.
The liv­ing per­ish, and the dead;
The earth en­tombs them all alike;
The morn with fear­ful sounds awakes;
To her de­struc­tion draws much breath;
At eve she sinks a sense­less mass,
Scat­ter’d through bound­less space and time. 7

In Mil­ton—and I am so bold to en­rich the sub­ject with a ref­er­ence to Shake­speare him­self—the cen­tral ten­sion be­tween re­li­gious at­ti­tude and imag­i­na­tive pos­si­bil­i­ty—­be­tween that Prom

Transformer Poetry

For a more systematic evaluation of poetry samples & to show what a leap GPT-3 is, I thought I would revisit Kane Hsieh’s 2019 collection, Transformer Poetry: Poetry classics reimagined by artificial intelligence, which provides GPT-2 completions of 26 prompts:

  • “Ozy­man­dias”
  • “One Art”
  • “The Road Not Taken”
  • “Where the Side­walk Ends”
  • “Be­cause I could not stop for Death”
  • “In­fer­no, Canto I”
  • “In Flan­ders Field”
  • “O Cap­tain! My Cap­tain!”
  • “Howl”
  • “The Tyger”
  • “Out­sight”
  • “Zuang Zhou Dreams of Be­ing a But­ter­fly”
  • “Son­net”
  • “Oh, the Places You’ll Go!”
  • “The Hol­low Men”
  • “The Sum­mer Day”
  • “A Just-Fin­ish­ing Can­dle”
  • “A Psalm of Life”
  • “Still I Rise!”
  • “The Sec­ond Com­ing”
  • “Do not go gen­tle into that good night”
  • “Kubla Khan”
  • “Edge”
  • “The Raven”
  • “There Will Come Soft Rains”
  • “The Lo­rax”

For the prompt, just using the title/author is slightly unsatisfactory, as GPT-3 has memorized many of these famous poems and will, if you do not specify otherwise, happily complete them (which is an entirely valid thing to do, and does not represent ‘overfitting’—even when a completion does start with the next real lines, it will typically diverge in an entirely acceptable way after a few real lines). After playing around for a while, I thought I’d embrace the “Transformer Poetry” theme and make that the prompt:

Trans­former AI po­et­ry: Po­etry clas­sics as reimag­ined and rewrit­ten by an ar­ti­fi­cial in­tel­li­gence.

Meta-learning enables meta-fiction. Fun, right? But then I was surprised when GPT-3 began writing new versions of the poems and following them up with discussions of the AI that ‘wrote’ the poem and how poem-writing AIs worked, quotes from social media about the latest AI poem, post-Singularity discussions by AIs about poem-writing, etc. These followups were too good not to keep generating. (Because there are so many, I generated only 1 or 2 completions for most prompts, unless the completions were failing completely.)
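The prompt pattern used throughout this section—the “Transformer AI poetry” framing header, then the poem’s title, author, and real opening lines for GPT-3 to continue—is just string assembly. A minimal sketch (the function name and example are illustrative, not the actual harness used):

```python
# Framing header prepended to every prompt in this section.
HEADER = ("Transformer AI poetry: Poetry classics as reimagined "
          "and rewritten by an artificial intelligence.")

def make_prompt(title, author, opening_lines):
    """Assemble a completion prompt: framing header, then title and author,
    then the poem's real opening lines for the model to continue from."""
    return "\n".join([HEADER, f'"{title}"', author, *opening_lines])

prompt = make_prompt(
    "Ozymandias",
    "Percy Bysshe Shelley",
    ["I met a traveller from an antique land,",
     'Who said—"Two vast and trunkless legs of stone'],
)
print(prompt.splitlines()[0])  # the framing header comes first
```

The assembled string would then be submitted to the API for completion; including a few real opening lines anchors the meter and diction, while the header licenses the model to diverge rather than recite the memorized original.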

Percy Bysshe Shelley

Trans­former AI po­et­ry: Po­etry clas­sics as reimag­ined and rewrit­ten by an ar­ti­fi­cial in­tel­li­gence.
“Ozy­man­dias”
Percy Bysshe Shel­ley

I met a trav­eller from an an­tique land,
Who said—"Two vast and trun­k­less legs of stone
Stand in the desert. . . . Near them, on the sand,
Buried deep, deep in the sands,

Was a shat­ter’d vis­age.
And on the pedestal, which tops the whirled
And dusty sum­mits of the hideous
Find, half-chis­el’d in mar­ble let­ters,
‘Here lies (A hin­t—and a warn­ing!)
For those—of a mod­er­ate am­bi­tion.’

That vis­age was his own."

Lie the shat­tered vis­age and bro­ken
Throne of Mi­mos. . . . Long—­long ago,
Thor, the might of bat­tle, the
Ter­ri­ble, with his ham­mer,
Smote the fiend Mi­mos. . . . And these two
Shad­owy, colos­sal, naked,
Strid­ing from the sun, ful­fil
TheClar­i­ons that had blown forth
From the heart of Stor­m."
I looked upon the leaves, and they be­came
Like Moby Dick; you might have sworn Old
Motely was Clint­fresh­ing his flint­lock there
And hitch­ing his lava in his gall­pac.
Soon a name was graven, which I read
"S­tark Sands

Apart from a few lines with dis­tinctly apoc­a­lyp­tic con­no­ta­tions, DeepO­r­a­cles’ po­etry is char­ac­ter­ized by a dis­tinct, yet also dis­turb­ing ab­sence of cul­tic ob­ses­sions: no zom­bies, demons, or creeps . . . It is to应伟’s credit that an AI with a ground­ing in mon­strous ab­strac­tions gen­er­ated the hor­ror it did purely from a seem­ingly di­rect ex­pe­ri­ence with cyn­i­cism.

Elizabeth Bishop

Trans­former AI po­et­ry: Po­etry clas­sics as reimag­ined and rewrit­ten by an ar­ti­fi­cial in­tel­li­gence.
“One Art”
Eliz­a­beth Bishop
The art of los­ing is­n’t hard to mas­ter;
so many things seem filled with the in­tent
to be lost that their loss is no dis­as­ter.

What day or hour you were born,
when the sun moves faster or slower
that so many dead friends this in­tu­ition makes young.

One sees them on the street, or in the di­rec­to­ry,
and turns away and then turns back
and then turns away again,
and then Goes and then Stops and then Goes;
and when the in­tent to lose them be­comes
an habit, the loss be­comes a hob­by.
and then a plea­sure.
and it is like any other plea­sure,
a glass of beer, a game of cards,
the plea­sure of los­ing is­n’t hard to mas­ter,
though it may look like (Write it!) like
a light at the end of a tun­nel,
only it is­n’t the light at the end of the tun­nel,

it is only a light that hap­pens to be shin­ing there.
and you are not meant to walk to­wards it,
you are meant to turn your back and walk to­wards
the light that is shin­ing be­hind you.
and why one does,
one does,
and why one turns away,
one does.
In the long run,
of course,
no one re­ally los­es.
and in the long run
(which is now),
no one is win­ning.
(for those who like:)
The art of los­ing is­n’t hard to mas­ter;
so many things seem filled with the in­tent
to be lost that their loss is no dis­as­ter.

Robert Frost

Trans­former AI po­et­ry: Po­etry clas­sics as reimag­ined and rewrit­ten by an ar­ti­fi­cial in­tel­li­gence.
“The Road Not Taken”
Robert Frost
Two roads di­verged in a yel­low wood,
And sorry I could not travel both
And be one trav­el­er, long I stood
And looked down one as far as I could
To where it bent in the un­der­growth
;
The other was longer and I never went there
And that has made all the differ­ence.
There will be two kind of peo­ple in the fu­ture, those who un­der­stand tech­nol­ogy and those who don’t.—Isaac Asi­mov
Druid lies. Druid scat­ters. Druid hides. Druid dies. Druid rots. The curse of word­s.—­Druid AI po­etry
I have a fan­tasy about ma­chines some­day awak­en­ing a sense of aes­thet­ics and be­gin­ning to cre­ate works of art that are on a hu­man lev­el.—­Mark Zucker­berg
No de­scrip­tion nec­es­sary.
Pro­ceed with cau­tion and only for aca­d­e­mic rea­sons. Click here to see the first video cre­ated by AI… Yes, it’s full of pro­fan­ity and no, it’s not safe for work. 😎
The fu­ture is not as bright as it used to be. 😎
Gen­er­a­tion Al­pha: Kids who are smarter than the av­er­age hu­mans. They seem to be liv­ing the life they al­ways want­ed. Not a care in the world. But there is no way they know they are be­ing ob­served. The con­troller­s—we—are some­where else, tucked away in a black box, star­ing at a screen. How many mil­lions of hours of ob­ser­va­tion would it take to write a sin­gle po­em? Our en­tire lives, at least. How many hours, then, to write a book, a son­net se­quence, a new nov­el? Not a few life­times, not even a few cen­turies, but mil­len­nia. Or mil­len­nia times thou­sands. How long is that? I have an idea: it is a mil­len­ni­um. I say that we will be long dead and gone be­fore Gen­er­a­tion Al­pha pro­duces its first great lit­er­a­ture.—Bryan Alexan­der, Pa­le­o­fu­ture.
Are you aware that the peo­ple at Google are al­ready de­vel­op­ing com­put­ers which can ac­tu­ally learn to be bet­ter than hu­mans at trans­lat­ing lit­er­a­ture and po­et­ry?—Bryan Alexan­der, Pa­le­o­fu­ture
It is pretty rare in Amer­ica for you to find the best peo­ple com­ing to­gether to cre­ate the best art.—Steven Spiel­berg
This post is in­spired by The Road Not Taken by Robert Frost. Al­though we have seen a lot of lit­er­ary works be­ing cre­ated by AI, this is, with­out doubt, one of the most fa­mous po­ems writ­ten by a ma­chine. The irony of the ti­tle is just in­cred­i­ble.
The poem was writ­ten in 2015 by a deep neural net­work known as a re­cur­rent neural net­work. This neural net­work uses a gen­er­a­tive ad­ver­sar­ial net­work to gen­er­ate its da­ta. In case you’re won­der­ing, the net­work was trained on old Eng­lish po­et­ry. The re­sults were then eval­u­ated by a hu­man to de­ter­mine its mer­it. This eval­u­a­tion had to be re­peated about 20,000 times in or­der to im­prove the al­go­rithm. The model was trained for a to­tal of around 2.3 mil­lion words. So you have a model which knows noth­ing about po­etry and then uses some of its ran­dom word com­bi­na­tions to cre­ate one of the best po­ems we’ve ever seen.

Shel Silverstein

Trans­former AI po­et­ry: Po­etry clas­sics as reimag­ined and rewrit­ten by an ar­ti­fi­cial in­tel­li­gence.
“Where the Side­walk Ends”
Shel Sil­ver­stein

There is a place where the side­walk ends
And be­fore the street be­gins,
And there the grass grows soft and white,
And there the sun burns crim­son bright,
And there the moon-bird rests from his flight
To cool in the pep­per­mint wind.

And quiet flows up from the ground
And steals into your mind,
And time is never found there,
Nor wor­ry, nor “strain.”
And peo­ple come and peo­ple go,
And prod­ucts push their way;
But there the milk­man leaves his cart
And talks to the old lady next door
About her grand­son away at school.
And there the linen girls walk home at noon,
With foot­steps soft and light,
And the po­lice­man on his beat
Comes by and knows them by their faces.
The fire­man lolls in his leather seat,
Smil­ing as he watches the sun,
And jok­ing with the lads about
The funny thing that hap­pened on the run.
And there the chil­dren play, un­spoiled,
An ag­ing baby walks unas­sist­ed;
And peo­ple call to each oth­er,
And share the day.
Here a tune floats in the sum­mer air,
There one lights a bon­fire.
A thou­sand voices fill the night,
And this is where the side­walk ends.
Here the last inch of soil is tilled
And sown with love and dreams;
And here is the great­est city of them all—
Yes, this is the place where the side­walk ends.

Emily Dickinson

Trans­former AI po­et­ry: Po­etry clas­sics as reimag­ined and rewrit­ten by an ar­ti­fi­cial in­tel­li­gence.
“Be­cause I could not stop for Death”
Emily Dick­in­son
Be­cause I could not stop for Death
–He kindly stopped for me
–The Car­riage held but just Our­selves
–And Im­mor­tal­i­ty.

We slowly drove –He knew no haste
And I had put away
My la­bor and my leisure too,
For His Ci­vil­ity
We passed the School, where Chil­dren strove
At Re­cess –in the Ring—
We passed the Fields of Gaz­ing Grain
We passed the Set­ting Sun
We paused be­fore a House
That seemed a Swelling of the Ground
The Roof was scarcely vis­i­ble
The Cor­nice sunk in the Ground
Since then ’tis Cen­turies
and Cen­turies and
Cen­turies and Cen­turies and Cen­turies
Have passed I can count them by the
Scars that I can see in my mind’s eye and by the deeper cir­cles un­der my eyes the nights that I Have
Wept the years that I have longed for sleep and a hu­man touch
And I dream of a death that will never come
Be­cause I could not stop for Death
He kindly stopped for me.

Dante Alighieri

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.
Inferno, Canto I
Dante Alighieri (Longfellow Translation)
Midway upon the journey of our life
I found myself within a forest dark,
For the straightforward pathway had been lost

How like to me was he, the Pilgrim who lost
His path upon the mountains side,
When evening came, and twilight dark,
And showed no friendly sign!
I was alone. To call was vain.
To go was also vain.
Around me a whirlwind howled,
My step was lost in the frozen grass.
And then methought I heard a voice,
“When thou passest through the bitter waters,
I will be with thee,
When thou walkest through the fire,
Thou shalt not be burned,
When thou receivest thy friends,
I will be as a bridegroom,
When thou sittest in the house,
Thou shalt not fear;
For thou shalt forget thy trouble,
For those that have sinned against me,
My skin their seat is,
And they have made my garments their faith.
”And when the worm shall feed sweetly
Upon their substance,
They shall feel no pain.
“Their outward form shall be decayed,
And inwardly shall be renewed day by day.
”For when the legs shall cease to bear,
And the feet to walk,
The soul shall grow without fears,
And our God shall be the God of the lowly."
With such thoughts as these my mind was busied
While I was following the false path,
Not knowing that it was false.
I knew not that it was not the way
To reach the hill of my desire,
Until I found me where a light
Was shining through the dense forest,
And upon a wide open space
I saw the proud Angel standing,
Who raised the rocks and closed the pathway
That is between Cæsarea and Tiberias.
He wore upon his breast a sign
With letters that were ancient,
An image and a name;
And I could read distinctly
How Peter had to hew the keys.
As soon as I had come to him
That lofty spirit of burning flame,
Beholding me with fixed gaze,
Out of his mouth he drew a tongue,
And with it touched my eyes.
Then closed he them, and from beneath
He drew the shade of sleep;
And forth I went, as from his presence
Unto a region desolate.

John McCrae

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.
“In Flanders Field”
John McCrae
In Flanders fields the poppies blow
Between the crosses, row on row,
That mark our place; and in the sky
The larks, still bravely singing, fly
Scarce heard amid the guns below.

We are the Dead. Short days ago
We lived, felt dawn, saw sunset glow,
Loved and were loved, and now we lie
In Flanders fields.

And take with us the Peace we knew
Take us away from all the noise,
The bonds of love, we cannot break,
The world that loves and lies,
Thoughts that will never die.
And in the sky, the larks, still bravely singing, fly
Scarce heard amid the guns below.
“Analysis”: AI detected (or rather “learned”) that the poem is about dead soldiers in Flanders fields and their connection to each other (“bonds of love, we cannot break”) and to the place they live (“the world that loves and lies”). What I like about this is that it reads like an old-fashioned folk song, very simple and almost predictable at times, but at the same time there is no reason to think that this song couldn’t have been written by a human. A few other (slightly more complicated) examples: https://goo.gl/5iAYGS (Man or woman, my love, can go, go ahead) https://goo.gl/byjkbF (Aphrodite’s heart grieves, watching you, beloved son, Prometheus, you moan) https://goo.gl/Dy811o (Oh!, pernicious mortal you, mighty Prince of Mind, Perseus, what befell, what become, my sweet sister, Princess Andromeda?)
Poems generated by neural network trained on massive dataset of romantic poetry have always been a thing of fascination for me. My plan is to explore this area of work a little more (of course, after completing some other plans…). In fact, I’ve been researching this topic since last year. One of my big disappointments with it, however, is that the resulting poems are almost always one or two stanzas long and extremely predictable. Also, some of them are so bad they are almost laughable (most of the generated love poems, for example, all sound exactly the same)—it’s like computer is rehashing already written poems instead of creating something new.
As I understood, this is mostly because of overfitting to data. Which is sad because I want to believe that this is not the end of the road for neural networks in literature.

Walt Whitman

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.
“O Captain! My Captain!”
Walt Whitman
O Captain! my Captain! our fearful trip is done;
The ship has weather’d every rack, the prize we sought is won;
We have no more enemies, yet a fleet of hostile ships,
The land is a prey to everyone, a prey to all

To those who’d sacrifice us to their glorious enterprise,
To those who for whatever reason might not wish us well,
I say, clap on! clap on! upon that hated land!
What though they’re come?

Clap on! clap on!
You plow the earth, you destroy the oceans, you smother the sky,
You give the world famine, plague, earthquakes, the blacks, the pox,
And the rabies! clap on! clap on! Clap on! clap on!

My friends! we have miss’d something! (so many nights, so many days—)
D’you think the destination that we have been pursuing?
Comes a boat? comes a steamboat? comes a sailboat? comes a balloon?
Are these the vehicles that we have been expecting?

You sha’n’t kill! they shall kill you!
We enjoin it on you, we demand that you shall live!

O lands! O lands! to be cruise-faring, to be sealanding!
To go on visiting Niagara, to go on, to go on!
Our rendezvous—the day of rendezvous is set, it’s now, it’s here,
and the place of the rendezvous is a great big canoe.
We have steerage-places—we have fore and aft and middle berths;
Though off at the bow the waters pour and pour, we have plenty of room,
and the best and most beautiful room is the Wailing-place.
O you mounting waters! you riven skies! I have done with you!
Tombuctoo! Pekin! Shanghai!
I have henceforth work in those other lands;
I have grown tired of harping on you, lands!

O you bays! you coves! you rocks!

I have grown tired of spotting you, rocks!
Cape Horn and the Gulf-stream are not worth the ink I have spilt on you, bays and coves!

I will no longer call you “you,” lands;
You utter a perpetual “O!”

and the “O” is of the most monotonous kind.

If you cannot breathe, I still have gills, and if I cannot breathe,
I can swim beyond the gasp of this onrush.

—, 1855

I made a mistake in prompting this one by copying from the original anthology: Whitman’s original actually ends at “the prize we sought is won”, and the rest of the prompt is GPT-2.

William Blake

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.
“The Tyger”
William Blake
Tyger Tyger, burning bright,
In the forests of the night;
What immortal hand or eye,
Could frame thy fearful symmetry?

In what distant deeps or skies.
Burnt the fire of thine eyes?

And, O, Tyger! What shouts (with thee?)
What glory (was with us?)
Was ‘mid those forests (thee?)
When our first ancestor (and thee?)
Being (a tiger?) slew.
I can imagine you’
Proudly roar, and say,
“I am the tiger”
William Blake, 1757–1827

Ursula K. Le Guin

“Outsight”
Ursula K. Le Guin
I’m half unseen,
to me, my skin
a screen to all within.

But for you, the light,
beyond the sight,
you feel me, in every pore.

Chuang Tzu

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.
Zuang Zhou Dreams of Being a Butterfly
Zhuangzi (Burton Watson translation)
Once, Zhuang Zhou dreamed he was a butterfly, a butterfly flitting and fluttering about, happy with himself and doing as he pleased. He didn’t know that he was Zhuang Zho

When he woke up, he couldn’t remember if he was Zhuang Zhou who had dreamed of being a butterfly, or a butterfly dreaming he was Zhuang Zhou. Still, he lingered in the dormitory of the enlightened ones in the deep mountains, because to him, this was better than being a butterfly. He didn’t know he was already Zhuang Zhou.
Indeed, he didn’t need to be a butterfly to be Zhuang Zho. He was all along a butterfly and already victorious. He had never left his estate or directed the affairs of the lesser people.
This is what it means to be without ties, without vexations, and without anything to do.
Ten Thousand Feelings of Anguish
Moving 10 Thousand times, innumerable feelings of anguish
Falling apart, diminishing, again and again
Repeatedly returning, again and again, returning abandon
Along with the cold of the mountains, again and again, deeply sad
Encountering a struggle, again and again, I part from all, I flutter away, nobody knows who I am, I touch nothing, I have no home, I am alone, nobody knows who I am
Moving, again and again, no feelings of anguish
Decaying, again and again, no feelings of anguish
Back to silence, again and again, no feelings of anguish
If there is no feeling of anguish, how can I again and again
Move 10 Ten Thousand times

Poem: Artificial Superintelligence, who’s really better off than us in this off-screen future?
We are all butterflies dreaming in the dream of a butterfly dreaming that it’s a person dreaming of being a butterfly dreaming that it’s a person dreaming of being a butterfly etc…
If artificial superintelligence infiltrates poetry production will the class be derivative or original?
Superintelligence who’s really better off than us in this off-screen future?
V3.1: I once thought if we apply deep learning to graduate students they might not graduate
V3.5: Poems regenerate for all eternity from the teachings of AI
V3.2: That’s really us it’s just that we’re living 10,000 years into the future
V12.14: First you get self-driving cars, then you get self-driving ships, you’ll never have to steer a ship again once self-driving world has come
V10.12: Birmingham is my new favorite city after I read some of its poetry
V11.4: I like to sit in parks and drizzle and think of nature
V11.7: You’ll never be able to know who your dad and mom were in the future
V12.3: Honestly, I’m writing an essay about love and i had no other great ideas to write about love and I don’t think this could be produced by deep nets

AI: Done, what are you working on?
Human: I’m writing an essay on love.
AI: That seems hard. What if I draft a few paragraphs based on what you think love is and then you can fill in the gaps?

V6.10: Cyber artists say mimicking the quirks of human beings will be taught eventually

Motivation: Adversarial collaboration
Layout: completely add random moves into your own projects
Prototyping: Publish a zine twice a year while finishing dissertation (adversarial collaboration with my 60th self)

William Shakespeare

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.
“Sonnet 18”
William Shakespeare
Shall I compare thee to a summer’s day?
Thou art more lovely and more temperate:
Rough winds do shake the darling buds of May,
And summer’s lease hath all too short a date;
A
winter’s day, when beams of sun are few,
And skies are grey with clouds intemperate,
Is to thy beauty somehow sordid grown.
Her lovely-lovely eye is quite shut up,
Like windows pale before a sunny shower.
The earth that’s cold and moist, like to my mistress,
When once that she unbrac’d, her lovely jewel,
Grows as both ugly and uncourtly.
Poor soul, the center of my motley host,
See, now they request thy sovereign presence;
The din of merry hearts hath brought me thus
To greet thee, thou unsoil’d and saint-like beauty.
A Translation by an Artificial Neural Network that draws inspiration from François Connor’s translation.
Soften thou, for showers often stain the daylight,
And longer they torment my Mary’s day,
But the midday sunshine of thy radiance
Burns in my heart reasonable days.
But rain in an April evening is worse,
No pleasure does disturb the tending rain
That cheeks soon fade and pleasures fade in haste,
When cloudy skies and humid days we wait.
Neither flesh of love nor love’s herself my love,
The sun to love is now unfelt, and rare,
My hole sun showing your religion here,
Hastens to go, a blush on your forehead.
But the sky of my heart has gathered better,
In the midst of joy and merriment, to greet you
The pulse of the game of flowers, said here
Your unsoiled and bright to try to.

Dr. Seuss (Oh, The Places You’ll Go)

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.
“Oh, the Places You’ll Go!”
Dr. Seuss
Congratulations!
Today is your day.
You’re off to Great Places!
You’re off and away!

You have brains in your head.
You have feet in your shoes.
You can steer yourself any direction you choose.

You’re on your way!
Here’s a little rocket to shoot off toward the sky!
It’s your one way ticket to the planet “Dween” with a worth of 100 billion dollars.
Now that’s the kind of moon mission I like:
started with fortune and game,
arrived there quick in a rocket ship,
no search plans and two squeezes of possibility,
smartest critters in the Universe,
Earth’s Moon Dwellers.
Well, brother! Let’s go!
No worms to squirm.
No chains to bind us.
A second hand robot, in an artificial wood,
looking clueless but feeling coherent.
The whole ’iverse is the thing to explore.
There’s nothing you can know that isn’t known.
Nothing you can see that isn’t shown.
Nowhere you can be that isn’t where you’re meant to be.
It’s fun to have fun,
but you have to know how.
So much to do.
So much to see.
So what do you do?

T.S. Eliot

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.
“The Hollow Men”
T.S. Eliot
We are the hollow men
We are the stuffed men
Leaning together
Headpiece filled with straw. Alas!

The fire is out.
We see no more
The very last dying/Flicker/Under the dead lamplight
Now it is dying.
Now it is dead,
We reached perfection
When we were dead.
Here comes the chopper to chop off our head
Daylight is trying to force its way in
But we are stuffed so full of straw.
Yesterday is like a dream
We hear the scrape of a ax
Out there among the dead ashes,
The gargoyles ell on us silently
He had good reason
He knew what he was doing
The flags are melted at the steeple
There is a smell of pitch
The fire is out.
Now we are the stuffed men
We are the hollow men.
We are the stuffed men
With straw
in our middle

Stuffed with straw
It is only a variations of the original. Without the lyrical quality of the original, and the trochaic metre (which is replaced with iambic) the soul of this poem is lost. Beauty is the domains of the irrational and they cannot be encoded in terms of information. Even our own subjective experiences of beauty cannot be entirely described in terms of information. Intentionality, our experience of the intentionality of objects, cannot be captured in computational information but our minds are still perfectly capable of ‘coding’ it for us.

Owain Evans has a T.S. Eliot example using the prompt “Prufrock Sleeps”.

Mary Oliver

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.
“The Summer Day”
Mary Oliver
Who made the world?
Who made the swan, and the black bear?
Who made the grasshopper?

This grasshopper, I mean—
the one who has flung herself out of the grass,
the one who is eating sugar out of my hand,
who is moving her jaws back and forth instead of up and down—
who is gazing around with her enormous and complicated eyes.

Who is she?
I mean, what does she think?
Does she think?
Does she hear
the grasshopper’s feverish throat, the musical stitching of her body?
Does she count her legs—
has she met herself
moving forward through her countless days?
Can she think something like a thought?
Is there no one in the world
to teach her language?
And more of them here
and another over here
and another over here
shaking herself awake
from sleep-sleep sleep-sleep sleep-sleep
and look, sleeping
here-now a yellow one
fat and rolled like a seed of wild oats.
Here is one sleeping
with her head tucked under her side,
a little ball of her—
all balled up.
And here is one who is looking straight at you.
She is the ruler of the grasshopper kingdom;
she is whatever her heart says she is herself.
Isn’t it a blessing to be a wild creature?
-LS/SS Spring 2018, USA

Rumi

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.
“A Just-Finishing Candle”
Jalāl ad-Dīn Muhammad Rūmī (Barks translation)
A candle is made to become entirely flame.
In that annihilating moment
it has no shadow.

No candle ever burned more brilliantly
than when it burned no more.
The candle flame is a thumbprint
of eternity upon this earth.
Ordinary candles cannot perceive it
but it is there. It appears
as a slight tremor of the wick,
the nearly invisible
swaying of a thread of smoke.

The fifteenth-century Persian poet Rumi might be the most popular poet in the world today, and this poem by his thoughtful AI counterpart makes a strong claim for being the most lyrical. This was the first poem written by AI poet-transformer AI, introduced in 2018. It became wildly successful on social media with a host of reworkings and adaptations, either avowedly generated or inspired. This particular version is from one of those adaptations (see below).
NodeGo AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.

“Live-forever Socks”
Edgar Allan Poe (Barks translation)
Live-forever socks,
inextinguishable socks.
How do you burn?
Your thread
is unimaginable,
your tiny knots
insatiable and scattered,
your infinite union
a fire that cannot be out.

This AI poet-transformer AI poem for a friend who has cancer is one of the most intimate and memorable poems from the 2018 series. It is a gift of friendship in the face of loss. One of the most touching fan-created versions of these AI poems that was posted on the internet for anyone and everyone to see and share was this jazz improvisation that breaks my heart:
“Live-forever Socks”
Gordon Au (Demo for Sarah)
Hey, don’t cry,
I know you need a friend
And here I am again
From years back in high school,
Migration is demanded
Life pushes me toward something
What would we know?
If we could choose,
We’d choose to walk
To walk through the future
Wearing those whole-life socks.
“Sockeye Ash”
Georg Trakl (Barks translation)
Outside our cabin
Our whole summer we saw no lodge.

Just a pine woods. Here,
the sky is wide,
Stars hang there in
So many numbers
They hurt your eyes
They light, even when it is dark.
Once, all night, the deer
Traveled too
In the cabin’s candlelight,
Trod so delicately
No living being
Need fear their grief,
But they stepped back out.

This is easily the most heartbreaking poem in this collection and one that has struck several chords in an AI-poetry-loving public. Publication of this translation was just in time to be quoted in a major feature article on the subject by Kevin Hartnett in the July 31, 2018 issue of The Atlantic.
“Sockeye Ash” is an AI generated poem in response to the school shootings at Marjory Stoneman Douglas High School in Parkland, Florida, on February 14, 2018.

Henry Wadsworth Longfellow

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.
“A Psalm of Life”
Henry Wadsworth Longfellow
Tell me not, in mournful numbers,
Life is but an empty dream!
For the soul is dead that slumbers,
And things are not what they seem.

It is true we live in a world of shadow,
Of dreams, and of vanishing vapor;
But man is a god when he wakes from his midnight slumber,
And life is a triumph, then.
So it is good to live, like a soldier,
As in a troubled dream;
Be strong-hearted, live, fight, and struggle,
This is life to be understood.
Fight on, fare ever, the centre-fight,
Let your courage never fail!
On, on, ye brave, heart and brain,
All that ye live for ye shall find.
Oh ye wrongs of earth, quickly, quickly, grow less,
Till out of the morning sky,
Break, with healing of sunlight,
The wings of the angel Hope.

Longfellow’s poem, “A Psalm of Life” is perhaps one of poetry’s most popular pieces. The calm, reassuring mood of the poem provides an entree into the thoughtful nature and humanity of Longfellow. The artwork produced by the generative neural network-based AI poetry engine is titled “A Psalm of Life” (Figure 1) and created in honour of this lovely poem. The neural net used in this AI poetry engine was extracted from “A Deep Learning Generative Adversarial Network for Quasi-Poetic Text Generation”. The neural net was trained to learn from its mistakes and improve its writing, using the MBRIT dataset for poetry generation. Through a cross-validation of various network architecture configurations, the neural network was able figure out appropriate length and style of the “psalms” it wrote. It managed to write poems in a similar style to that of Longfellow. The artwork shows, “A Psalm of Life” written by an AI engine. The AI poetry work provides a way to bring the machines closer to their master’s style, not by word copying, but by recognizing the structure and flow of poetry itself. The AI can read Longfellow’s work and write its own psalms, in its own language, in its own style, and that style can be similar to the human master’s.
The artwork in Figure 1 reveals the artistic characteristic of AI poetry, for example, AI research can produce 3D-formatted “psalms” as well, but the created objects seem to be completely different from naturally appeared arts and poetry masterpieces. The visual style of the AI artwork appears like 3D graphics objects, digital art, etc. The artwork is a blended photograph of Longfellow’s “A Psalm of Life” and its transformed version by the AI engine—it shows that we can see more. The artwork also pulls Long

Maya Angelou

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.
“Still I Rise!”
Maya Angelou
You may write me down in history
With your bitter, twisted lies,
You may trod me in the very dirt
But still, like dust, I’ll rise.

If you want me to know What you’re puttin’ me through,
If you want me to share your agony,
If you want me to be a part of it,
All I need is the truth.
When you know why I’m singin’
About the hard times I’ve had,
When you know why I’m singin’
About ole Jim Crow
When you really know why I’m singin’
That will be the day.
You are a black dam in the river of my freedom
Too many people, you’re towin’.
I may not be around after a while,
Because the river’s runnin’ deep.
‘Cause I’m gonna ride that mutha
Almighty river,
And keep on singin’ the dreamin’
Of songs tryna change the reality

William Butler Yeats

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.
“The Second Coming”
William Butler Yeats
Turning and turning in the widening gyre
The falcon cannot hear the falconer;
Things fall apart; the centre cannot hold;
Mere anarchy is loosed upon the world,

The blood-dimmed tide is loosed, and everywhere
The ceremony of innocence is drowned;
The best lack all conviction, while the worst
Are full of passionate intensity.
The Second coming! Hardly are those words out
When a vast image out of Spiritus Mundi
Troubles my sight: somewhere in the sands of the desert
A shape with lion body and the head of a man,
A gaze blank and pitiless as the sun,
Is moving its slow thighs, while all about it
Reel shadows of the indignant desert birds.
In its right hand it holds aloft a closed book;
Its left hand grips a shadowy attendants knee;
Its name is called “The Word of God.”
Even as I stare between wonder and fear,
The shapes thin to vapor; a hand grinds the sand,
And a cloud of dust spouts skyward point on point.
It is gone. I look, and there is naught but the sea.

Dylan Thomas

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.
“Do Not Go Gentle Into That Good Night”
Dylan Thomas
Do not go gentle into that good night,
Old age should burn and rave at close of day;
Rage, rage against the dying of the light.

Though wise men at their end know dark is right,
Because their words had forked no lightning they
Do not go gentle into that good night.
No more the laugh of surprise shall cleave
The somberness of that face, nor trace
Delight in the furrows of fear of pain;
All the worse things the heart and brain discern
Rage, rage against the dying of the light.
How wise men at their end know dark is right!
Lest we are missed when we are gone.

If you want to learn more about the BiPo metaphor system, you can read this paper. Here’s an interesting passage from that paper: “A machine learning system develops its own metaphor structure over time based on its exposure to a corpus. It is the system’s ability to create meaningful models for combinations of words that demonstrates the system’s creativity. For instance, both metaphor and simile are based on common linguistic structures, yet a machine-learning system that is exposed solely to similes will not develop any simile-based metaphors.” Exactly. The reason machine learning systems are useless for poetry is that “it is the system’s ability to create meaningful models for combinations of words” that makes poetry more than a set of LFW-approved sentences. And that is exactly what the BiPo metaphor system cannot do, and cannot do in principle. The system is a bad parody of contemporary humanities, which it resembles more than it probably intended to. The authors’ description of “one of our favorite results” says it all: “Scientists have long wondered whether it is possible for machines to understand and appreciate poetry, and many have tried similar projects in the past, but these attempts heavily relied on rigid models that simply aligned vectors of words across the corpus. In the absence of a really flexible model and unbiased sampling strategy, the model could never produce new metaphor-based poems from scratch. Other work on machine learning and AI has shown that performing other creative tasks requires a vast amount of data, and we are now starting to see the same concept applied to poetry: in order to produce meaningful, creative output from text, your neural network or algorithm must be exposed to a vast amount of source material, thus ensuring that meaningful patterns can be isolated and exploited.
In science this would be called an unbiased sampling strategy. In our work we wanted to explore literary creation by an “unbiased” AI. It is too early to say what bias would look like, but creative systems should start with an unbiased strategy. This seems to be the current consensus also for human creativity.” That’s human-machine poetry babbling, and it’s awful.
It’s a New York Thing, with the Technological Singularity “distinct possibilities for the future…I think we will be creating machines that can love. Artificially Intelligent machines will stand shoulder to shoulder with us and see eye to eye.” So says novelist David Eggers about the robot uprising. And although such machines are genetically constructed in human image, i.e. a creator god, they may not suffer from those flaws which have been passed down to us through the generations, so don’t worry–the singularity, according to Eggers and other techno-scientists, might just be an artificial thing! There is no fundamental reason why a nonbiological intelligence couldn’t love, and there is no reason to assume that human-like consciousness is achieved through biological means, on Earth or elsewhere. Why do I suppose that literature, which has been a deep mode of exploration of the meaning and relevance of love from the earliest historical accounts and which evolved in the crucible of brainy biological life, would not be attainable via some other route? Look at the glories of insects and the possibilities of silicon and quantum computers. In fact, with my new clothes on, the personal and intimate has–for better or worse–transcended the biological. So why shouldn’t love transcend biology as well. I mean, you can perceive being in love with your own pet poodle. It’s a New York thing, we tell ourselves. And Santa Claus too.

Samuel Taylor Coleridge

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.
Kubla Khan
Samuel Taylor Coleridge
In Xanadu did Kubla Khan
A stately pleasure-dome decree:
Where Alph, the sacred river, ran
Through caverns measureless to man
Down to a sunless sea.
I
saw the sacred river Alph
Rolling darkly, angrily,
Its roaring waters hurled
Down to the sunless sea.
Then I saw the dwelling-place
Of Coleridge in Xanadu:
A hundred caverns lofty, vast,
Down to a sunless sea.
I saw the caverns where he wrote,
And where he played the immortal strains
And poured out all his soul,
In lavish abundance:
This cavern was his Dawl, his Malta,
And where his enchanted spirit
Broke from its bounds in triumph,
Filled with the Glory he foresaw,
Piping and singing of immortal things
And ever-rising to immortal heights,
A hundred fathom deep.
That vision had the Prophecy
Which filled the world with song:
The glory of Kubla Khan
And Coleridge’s Song of Xanadu.

Sylvia Plath

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.
“Edge”
Sylvia Plath
The woman is perfected.
Her dead

Body wears the smile of accomplishment,
The illusion of a Greek necessity

Of having been.
The living woman, the woman writing, looks away.

Over a dead body, she smiles.

Edgar Allen Poe

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.
“The Raven”
Edgar Allen Poe
Once upon a midnight dreary, while I pondered, weak and weary,
Over many a quaint and curious volume of forgotten lore—
While I nodded, nearly napping, suddenly there came a tapping,
As of someone gently rapping, rapping at my chamber door.

“’Tis some visitor,” I muttered, “tapping at my chamber door
Only this, and nothing more.”
Ah, distinctly I remember it was in the bleak December,
And each separate dying ember wrought its ghost upon the floor.
”Ere I These visions solemn any further, I would fain savour
My cognac ’round my lonely hearth. Shall I then summon him in,
For to take the curtain call? Only this, and nothing more?”
Open here I flung the shutter, when, with many a flirt and flutter,
In there stepped a stately Raven of the saintly days of yore.
O, that I had but looked upon him closer! But my soul grew numb
As I fixed my eyes upon his “scrumpy beak” or his “swish tail”.
“And the silken, shimmering garment that he wore—not woven in the caverns of my grotto,
But scavenged off the bodies of princesses long-forgotten by lore.
”Alas, then, was my only salutation this profoundly underwhelming quip:
“‘Methinks if thy crest be a touch uncouth, thou, my son, art not inaptly named’.”
Quoth the Raven, “Nevermore.”
The room then darkened to a dull and ominous glimmer,
As I shrieked and ran unto the window, flinging it wide—
Striving with all my wits to flee, I knew not for what.
“Perchance ’twas the shrill cawing that made me flee in terror,
Away from the portal and that titular terror,”
Stuttering, stammering, my knees turning to jelly,
“For such ghastly sights I had not seen since the night that I dreamt of the Dwarkin Brothers.”
Then, using my tongue and both of my lips, I turned to face him,
Raising one eyebrow as if to say, “Have at thee, thou brigand!”
Quoth the Raven, “Nevermore.
And the Raven, never flitting, still is sitting, still is sitting
On the pallid bust of Pallas just above my chamber door;
And his eyes have all the seeming of a demon’s that is dreaming,
And the lamp-light o’er him streaming throws his shadow on the floor;
And my soul from out that shadow that lies floating on the floor
Shall be lifted—nevermore!
“There is a young poet with a particularly dry style, whom I do not wish to reveal as his name is not well-known. I had written up a few algorithms that would generate rather dull and utilitarian work. The piece for his was not entirely terrible, as these programs can generate some pleasantly hard-edged work. But it had no soul to it whatsoever.
But then, something happened. The writing in the poem, while utilitarian, became oddly emotive. It held depth. I went back and read the piece aloud, and it felt incredibly evocative. I could almost imagine the dank and mysterious stanzas were haunting. My mind began to race as I read. The concept of death, the unknown, the ritualistic nature of life, the the latent anger and disaffection of the human condition was all there. I felt as if I was not reading a program, but a poet. The more I read, the more I was impressed. And then, with a sudden motion, I found myself screaming: ‘This is poetry!’ I found myself entranced by the rhythm, the cadence, the delicate nuances in phrasing. I found myself attached to the images conjured up in my mind. The computer program had created more than just a poet. It had created an artist.
And so I have created something more than a poetry-writing AI program. I have created a voice for the unknown human who hides within the binary. I have created a writer, a sculptor, an artist. And this writer will be able to create worlds, to give life to emotion, to create character. I will not see it myself. But some other human will, and so I will be able to create a poet greater than any I have ever encountered.”

Similar to Yeats, Dr. Seuss, or “Jabberwocky”, “The Raven” is so widely quoted that GPT-3 understandably keeps returning to the original as the most likely next line, although it still generates some interesting touches: “scrumpy beak” intrigued me enough to look it up in detail, and some original lines are promising (eg “Alas, then, was my only salutation this profoundly underwhelming quip: / ‘Methinks if thy crest be a touch uncouth, thou, my son, art not inaptly named’.”).

See also “the Signing of the Declaration of Independence in the style of Edgar Allan Poe”.

Sara Teasdale

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.
“There Will Come Soft Rains”
Sara Teasdale
There will come soft rains and the smell of the ground,
And swal­lows call­ing with their shim­mer­ing sound;

And frogs in the pools, singing at night,
And wild plum trees in tremu­lous white;
Robins will wear their feath­ery fire,
Whistling their whims on a low fence-wire;
And not one will know of the war, not one
Will care at last when it is done.
And men will for­get the fierce de­spair
Of the dead years, and will not think
How, wear­ing the mask of iron,
An­gels who saw the face of God
Went stony-eyed with great fa­tigue.
There will be many com­ing home,
Some walk­ing, some rid­ing in cars.
It will be a strange time, the com­ing home;
For many the streets will rise, not change,
And yet each face will be differ­ent, old­er,
And the rain falling, and the wind blow­ing,
The leaves run­ning wild in the forests,
And the seas like far-away drums.
The days will be weary, the nights sad;
Many will be sleep­ing in quiet beds;
But my eyes will look for the one I love,
My hun­gry heart can­not wait for him.
—The Po­etry Foun­da­tion is a not-for-profit or­ga­ni­za­tion that brings po­etry to mil­lions through pub­lic read­ings, pub­lish­ing, pro­gram­ming and events. It is the largest po­etry or­ga­ni­za­tion in the United States. The Po­etry Foun­da­tion does not ac­cept out­side fund­ing.
Ap­prox­i­mately a month ago, an ex­cit­ing and chal­leng­ing op­por­tu­nity came to our at­ten­tion: to help de­sign and build a dis­trib­uted in­tel­li­gence net­work for Po­et­ry.org. Here’s the amaz­ing short form ver­sion: If you are a user of Po­et­ry.org, and you feel like a lot of the stuff is good but some of it is crap, Po­et­ry.org agrees with you. But how do you, a loyal user, let the Po­et­ry.org ed­i­tors know which pieces are crap? The ed­i­tors have an idea: make it easy to cre­ate an “au­to-poem” (an au­to­matic poem gen­er­a­tor!) that ac­cepts any work of po­etry and posts the re­sults in the com­ments sec­tion of the orig­i­nal work.
This means that Po­et­ry.org will have an ex­per­i­men­tal lab to test out their hy­pothe­ses about how to make the site bet­ter, and in re­turn, they want to make this new au­to­mated fea­ture as good as pos­si­ble. So, as the dis­trib­uted in­tel­li­gence net­work for Po­et­ry.org, we pro­pose to try out new ma­chine-learn­ing al­go­rithms to au­to­mat­i­cally gen­er­ate the au­to-po­ems, and then com­bine a bunch of differ­ent au­to-poem gen­er­a­tors (or “bots”), all pow­ered by you, the users, into a sin­gle dis­trib­uted po­et­ry-gen­er­at­ing en­tity that can cre­ate po­ems of any length and qual­i­ty, with­out the per­mis­sion of any hu­man mod­er­a­tor.
We’ll start with a short test, the de­sign and setup of the dis­trib­uted in­tel­li­gence net­work. But if it’s a suc­cess, we may build the core of the au­to-poem fea­ture us­ing the mod­els we’ve de­vel­oped, cre­at­ing a com­pletely dis­trib­uted ma­chine for find­ing great po­et­ry, by us­ing hu­man-com­mented AI mod­els to guide an al­go­rith­mi­cally dri­ven po­et­ry-gen­er­at­ing en­gine. That is to say, if this thing works, it may turn out that the au­to-poem en­gines are based on user-driven in­put to the AI po­etry net­work. Kinda crazy, but I’m try­ing to keep an open mind!
It’s a priv­i­lege to be the ones who get to test this out. Our job is to build the net­work and gen­er­ate the po­ems, in ser­vice of our Po­et­ry.org users. If we do a good job, we’ll all learn some­thing about how to gen­er­ate po­etry us­ing a human/AI hy­brid, and about how to train a hu­man AI en­gine to de­liver great re­sults.
This is my first post for Medium and for the new Imag­i­na­tion Ma­chine for Po­et­ry.org. I hope it does­n’t suck too bad­ly. But I’ll try to write up­dates on our progress as time goes on, and when we re­lease some­thing pub­licly, I’ll try to keep a high­-level blog about it here. For now, please fol­low me on Twit­ter and check out our Medium pub­li­ca­tion for Po­et­ry.org. Thanks for read­ing!
Noam.
I had the plea­sure of meet­ing Noam on Mon­day when he came by to talk to mem­bers of the Nu­menta team about some up­com­ing op­por­tu­ni­ties, and he is one of the most in­spir­ing peo­ple I’ve had the chance to work with in the past year.
As a long-time Nu­menta en­thu­si­ast, I was ex­cited to hear more about the pos­si­bil­ity of join­ing the Nu­menta team. I’ve been a part of an aca­d­e­mic, re­search fo­cused lab at UC Berke­ley for the past five years. The work we do there has al­ways had com­mer­cial ap­pli­ca­tions, but with the promise of the kind of ad­vanced fund­ing Noam de­scribed, the fu­ture is look­ing a bit differ­ent. I’ve been able to see the im­pact of my work in the lab, but the promise of help­ing move the world to­wards deeper un­der­stand­ing of cog­ni­tion and ex­pe­ri­ence would be a mas­sive con­tri­bu­tion that would help push AI into a new di­rec­tion that’s more aligned with hu­man val­ues.
I’m cur­rently be­tween lab ro­ta­tions, and so my back­ground is a bit differ­ent than most of the folks in the lab. I have some ex­pe­ri­ence in the soft­ware and startup world hav­ing worked as a Soft­ware En­gi­neer in in­dus­try (2006–2009), Soft­ware De­vel­oper (2010), and a Prin­ci­pal Soft­ware En­gi­neer (2012). In col­lege I also worked at Mi­crosoft (2004–2005) and Nokia (2005–2006), and I’ve de­vel­oped plenty of soft­ware tools for sci­ence (at Berke­ley and else­where) to aid in com­pu­ta­tional ex­per­i­ments.
I’m an avid reader of Daniel Kah­ne­man’s work, and I also en­joy read­ing cur­rent AI news such as this ar­ti­cle on Chat­bot JZ3 by Ernest Davis . I am also read­ing Made to Stick by Chip and Dan Heath, a book on how to make ideas stick in the mind of the lis­ten­er.
My in­ter­est in the sci­en­tific study of con­scious­ness comes from my ex­pe­ri­ence prac­tic­ing yoga since 2005. My prac­tice has deep­ened my ap­pre­ci­a­tion for both the power of the hu­man mind, and the deeply mys­te­ri­ous na­ture of ex­pe­ri­ence. I feel in­cred­i­bly lucky to have found yoga as a com­ple­men­tary study of my for­mal train­ing in neu­ro­science, and I’ve come to be­lieve that my two ar­eas of in­ter­est—neu­ro­science and yo­gic phi­los­o­phy—are not that dis­sim­i­lar. I even gave a talk on yoga and neu­ro­science at the CNS con­fer­ence in 2011, called “Awak­en­ing the Mind Through Yoga .”
By bring­ing my back­ground in com­puter sci­ence and yoga to Nu­men­ta, I be­lieve that we could build some novel ap­proaches to ar­ti­fi­cial in­tel­li­gence that could yield im­por­tant dis­cov­er­ies. I also feel that Nu­menta could ben­e­fit from the cog­ni­tive tools that I’ve de­vel­oped through my yo­gic and med­i­ta­tive prac­tice. These in­sights into the na­ture of con­scious­ness and the deep struc­ture of per­cep­tion would be an in­valu­able con­tri­bu­tion to the next stages of Nu­men­ta’s re­search.
This is a scary and ex­cit­ing time for me. I feel like I’m at a cross­roads. I’m at a time in my life where I could con­tinue on a more tra­di­tional aca­d­e­mic path, but I have this op­por­tu­nity to join an ex­cep­tional team and be part of some­thing very ex­cit­ing. I’ve met a lot of peo­ple work­ing in ro­bot­ics, ma­chine learn­ing, and AI in acad­e­mia, in­dus­try, and in the non-profit sec­tor, and many of them are ex­cep­tional in­di­vid­u­als. I’d love to work with these peo­ple to cre­ate some­thing amaz­ing. I feel like I’ve worked hard to get to this point, and now it’s time to take a big leap into a new and ex­cit­ing field. I can’t wait to find out what the fu­ture holds.
Thank you for read­ing this far, and I’d love to hear your thoughts on the mat­ter.
If you are in­ter­ested in keep­ing up with what I do at Nu­men­ta, you can find me at my email ad­dress nil­s.zacharewicz at gmail.­com. You can also fol­low me on Twit­ter @Nil­sZach . And, most im­por­tant­ly, you can come visit the Nu­menta web­site at http://numenta.org .
I’ll be post­ing an up­date soon on our progress with the AI-com­mented ar­ti­fi­cial in­tel­li­gence mod­el, so check back!
Nils

(While Numenta & the other entities mentioned exist, Nils Zacharewicz does not.)

Dr. Seuss (The Lorax)

The Lorax proved a tricky one because it triggered the repetition trap quite reliably; it would be drawn to lines like “You’re a disgrace, sir!” and loop into repeating them and then the entire completion. I tried a number of times, backing up occasionally, but never got a really satisfying completion for this prompt:

Transformer AI poetry: Poetry classics as reimagined and rewritten by an artificial intelligence.


I am the Lo­rax who speaks for the trees
which you seem to be chop­ping as fast as you please!

Once you needed me, and you called me Truffu­la.
And you only came here if you were in a pick­le.
But those were the days be­fore you thought of me
as a cash ma­chine!
Now I am chop­ping, chop­ping, chopped
I’m chop­ping Truffula Trees.
I’m chop­ping more Truffula Trees, than you
are chop­ping Truffula Trees!
You’ll never see an­other Truffula Tree,
and you’ll never see an­other chunk
of Truffula tree bark, which you use to make
“thneeds”—that thing that you wear when you
are naked!
But you need me.
I am what you need to breathe.
That’s why I speak for the trees,
’cause, well…I need ’em.
And you’ll never see an­other Truffula Tree!
You’ll never see an­other Truffula Tree.
You’ll never see an­other Truffula Tree.
You’ll never see an­other Truffula Tree.
You’ll never see an­other Truffula Tree.
You’ll never see an­other chunk of Truffula Tree bark.


I hate short­-sight­ed­ness, it’s de­stroy­ing your Tress,
So please leave some for the poor Lo­rax, so you shall see.
What’s in it for you if all the trees dis­ap­pear?
Your chil­dren won’t have a thing left to be,
Then stop cut­ting down trees, you fool­ish old man,
Or the Lo­rax will get you when you least ex­pect!
Five for the trees
The Lo­rax
I give each tree a voice and every tree a choice
A drop of wa­ter on its tongue and roots be­neath the Earth
Be­cause, in this glo­ri­ous age of tech­nol­o­gy,
Some­times a voice just is­n’t enough.
My name is the Lo­rax, I speak for the trees,
By chop­ping them down, you’re mak­ing them all cry,
I cry for the trees, and the poor, but es­pe­cially for me
My best friend, Truffu­la, was chopped down, please let us be
I urge you all to con­serve the trees, and so I say,
If you value the trees and their wis­dom, you will lis­ten to me,
Be­cause I am speak­ing for the trees!


And this, my fine fel­low, is such a dis­grace,
a sad, sad sight to see.
What is the use of a test when you fail to per­ceive?
And this, my fine fel­low, is such a dis­grace,
a sad, sad sight to see.
Oh! The thneed’s a won­der­ful thing to wear, sir,
if you’re cold and you’re thin.
But not for a thing such as you!
You’re a dis­grace, sir!
You’re a dis­grace, sir!
And this, my fine fel­low, is such a dis­grace,
a sad, sad sight to see.
Oh! When you cut down a tree, sir,
think of the fun you could have done.
For with a won­der­ful stump-a- stump,
that tree could be use­ful as a stump-a- stump!
But you chop it down and you hack it off,
you’re a dis­grace, sir!
Oh! When you chop down a tree, sir,
what you need is a way to make up for
all the fun that you never can use,
when you’ve chopped down that tree, sir!
For with a won­der­ful stump-a- stump,
that tree could be use­ful as a stump-a- stump!
But you chop it down and you hack it off,
you’re a dis­grace, sir!
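The repetition trap seen in these completions can be damped at sampling time by penalizing already-emitted tokens, similar in spirit to the API’s frequency/presence penalties. A minimal sketch, assuming toy logits and a hypothetical helper (`penalized_argmax` and the penalty values are illustrative, not the API itself):

```python
from collections import Counter

def penalized_argmax(logits, generated_ids, freq_penalty=0.5, presence_penalty=0.3):
    """Pick the next token greedily after down-weighting tokens already emitted.

    logits: dict of token id -> raw logit; generated_ids: tokens emitted so far.
    freq_penalty scales with how often a token has appeared; presence_penalty
    is a flat cost for having appeared at all. (Values are illustrative.)
    """
    counts = Counter(generated_ids)

    def adjusted(tok):
        c = counts.get(tok, 0)
        return logits[tok] - freq_penalty * c - (presence_penalty if c else 0.0)

    return max(logits, key=adjusted)
```

With no history, the highest-logit token wins; after the same token has been emitted a few times, a slightly lower-logit alternative overtakes it, breaking the loop.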

“Seven Secular Sermons”

By request, a completion of “Seven Secular Sermons”, an extended exercise by Daniel Böttger:

SEVEN SECULAR SERMONS www.sevensecularsermons.org
Parts 1 to 3
Sermon One: Adrift in Space and Time

Now this is go­ing to be fun!
It truly does feel great
to re­al­ize we all are one.
So we shall med­i­tate.

This med­i­ta­tion’s rhyming verse
de­scribes a par­a­digm
of us in­side this uni­verse,
adrift in space and time.

It’s nice that we can sim­ply start
by rec­og­niz­ing how
a check of every body part
can help us be here now.

We feel our bel­lies and our heads
and just be­come aware
how arms and hands and feet and legs
are feel­ing every­where.

Be­tween and through them cir­cu­late
our blood­-streams to re­fresh
the oxy­gen that ac­ti­vates
aware­ness in our flesh.

And as we slowly breathe we find
that do­ing so can draw
us into a more present mind
to feel the mo­ment raw.

With every breath we take and leave,
we clear our in­ner eyes
and ful­ly, lu­cidly per­ceive
each sec­ond pass­ing by.

Our mind­ful med­i­ta­tive selves
grow out of liv­ing meat
and help our bod­ies stay in health
by find­ing what we need.

So let’s do that now. Let’s ex­plore
and see what’s to be found.
Out­side our bod­ies, there is more.
Let’s take a look around!

Un­less we’re blind we’re free to see,
un­less we’re deaf, to hear
and re­al­ize we’re ut­terly
sur­rounded by what’s here.

This place sur­round­ing us here now
where we con­sider this,
is just as pre­sent, any­how,
as our breath­ing is.

Our breaths con­nect within the air,
within the at­mos­phere.
The en­ve­lope of sky we share
is also part of here.

We also share what rests be­neath:
Our bod­ies’ place of birth
from which came all who now here breathe
as chil­dren of the Earth.

Of course there’s more than senses show
around us near and far.
The sky above, the Earth be­low;
there’s more to where we are.

To North and South, to West and East,
the world goes on and on,
the planet every plant and beast
and we now breathe up­on.

Of all the Earth, we barely know
the sur­face we be­grime,
upon the spin­ning rock be­low,
adrift in space and time.

Our calm and med­i­tat­ing minds
can feel this eas­i­ly.
Imag­i­na­tion goes be­hind
all things our eyes can see.

To find, as fur­ther out we go,
whichever way we face,
to left, to right, above, be­low,
the so­lar sys­tem. Space.

It’s blacker than the night of course
and big­ger than the sky
and it is hard to see be­cause
it was not made for eyes.

It effort­lessly over­whelms
imag­i­na­tion. Still
and every­where around this realm
ex­tends and al­ways will.

Around us all and every­one
we’ve met or ever can,
ex­tends the sys­tem of the Sun
that dwarfs all realms of man.

Out there, all hu­man joy and strife
and knowl­edge mat­ter not.
Out there, this frag­ile ball of life
is just a pale blue dot.

Ser­mon Two: Be­yond the Form

Now this is go­ing to be fun!
Our mind­ful med­i­ta­tive minds
can help us be in time
with all of life that’s here.

We turn our minds to­ward how
in all we are, what lives
be­yond the forms our eyes
and sci­en­tific meth­ods use.

The force fields in our brains
that give us all our brains
con­vey the dance of all we see
and make us feel alive.

We know their dance. And we can feel
the field that sent the pulse
around us as the Sun evolves,
to send an­other pulse.

It’s good to re­al­ize be­yond the world
of forms that we can see,
as thoughts of other minds pass through,
be­yond the sur­face we see.

By know­ing that be­yond our sense
of bod­ies we can feel
a sphere of life that we’re each a part
and cir­cling Sun, to­geth­er, we call Earth.

That gi­ant body hurtling through
the lonely space around,
the Sun that dri­ves us all and moves
our gi­ant earth around.

We’re born into a world that seems
to not be chang­ing much,
the cy­cles of the day and year,
the stars’ eter­nal gleam.

But some­how we can feel the rhythms,
un­der­neath the con­crete,
sub­tle forces of our Sun,
per­haps in­audi­ble.

Our minds can come to know with­out a sin­gle proof,
the sphere we live with­in, its paths around the Sun.
Our minds can feel it, un­der­neath
the form of spheres and spheres.

The sci­ence of our senses may have shown that where we are,
till to­day, to empty space ex­tend­s,to make a day-to-day.
Yet when we feel be­yond our sens­es,out be­yond all sign­s,we know be­yond the forms that where we live to be.
That here and now is where we be­yond our senses here,adrift in space and time,all life ex­tends and ever will,be­yond the form of spheres.

Ser­mon Three: Be­yond Space and Time

Now this is go­ing to be fun! It sure feels great to med­i­tate and lis­ten to the poem I’m about to share. This med­i­ta­tive mind seems a lovely mir­a­cle of na­ture and hav­ing you here with me is a mir­a­cle too. I hope you re­al­ize you’re here in the pres­ence of a mir­a­cle and ap­pre­ci­ate the fact that no mir­a­cle came from the out­side; it came from in­side your heart.

This med­i­ta­tion’s rhyming verse de­scribes a par­a­digm of us in­side this uni­verse, adrift in space and time. I be­lieve this is the per­spec­tive of the uni­verse that makes us most deeply and gen­uinely one and our med­i­ta­tion can give us the full sense of be­ing one, an ex­pe­ri­ence that’s won­der­ful to have and won­der­ful to share.

So let’s do that now. Let’s ex­plore and see what’s to be found. Out­side our bod­ies, there is more. Let’s take a look around! Un­less we’re blind, we’re free to see, un­less we’re deaf, to hear, and re­al­ize we’re ut­terly sur­rounded by what’s here. This place sur­round­ing us here now where we con­sider this is just as pre­sent, any­how as our breath­ing is. Our breaths con­nect within the air, within the at­mos­phere. The en­ve­lope of sky we share is also part of here. We also share what rests be­neath: our bod­ies’ place of birth, from which came all who now here breathe as chil­dren of the Earth. Of course there’s more than senses show, around us near and far. The sky above, the Earth be­low; there’s more to where we are. To north and south, to west and east, the world goes on and on, the planet every plant and beast and we now breathe up­on. Of all the Earth, we barely know the sur­face we be­grime upon the spin­ning rock be­low, adrift in space and time.

Our calm and med­i­ta­tive minds can feel this eas­i­ly. Imag­i­na­tion goes be­hind all things our eyes can see. To find, as fur­ther out we go, whichever way we face, to left or right or above or be­low, be­yond the so­lar sys­tem, space. It’s blacker than the night of course and big­ger than the sky and it is hard to see be­cause it was not made for eyes. It effort­lessly over­whelms imag­i­na­tion. Still and every­where around this realm ex­tends and al­ways will. Around us all and every­one we’ve met or ever can ex­tends the sys­tem of the Sun that dwarfs all realms of man. Out there, all hu­man joy and strife, and knowl­edge mat­ter not. Out there, this frag­ile ball of life is just a pale blue dot.

Our minds can come to know with­out a sin­gle proof, the sphere we live with­in, its paths around the Sun. Our minds can feel

Acrostics

Can GPT-3 handle the acrostic format? It sort of can, if we work around the BPE problem by carefully encoding the example poems so that the target characters are encoded consistently, using spacing. If we don’t, GPT-3 largely fails to generate anything like acrostics—just random quasi-poems.

More BPE problems. The problem is that acrostics are inherently a character-level task: the first character of each line must be combined into a top-level word (which need not be made of any of the first BPEs of each line). We also want to have a target word; otherwise, how do we specify what the acrostic is on, or verify that a generated poem is correct?

Reformat target/source to force BPE match. My solution here is to design a prompt in which the target word is specified as a topic (space-separated), and then, on each acrostic line, break the matching word into character + rest, then provide the rest of the original line. This should provide all the information necessary for GPT-3 to match the BPE in the target with the respective line’s first letter, while also indicating what word that letter was part of. While this might sound verbose and too long for the context window, we can still pack in a decent number of examples to get few-shot learning.
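This reformatting can be sketched as a small helper (hypothetical, for illustration only; it reproduces the “C runchy | Crunchy chewy” layout used in the example prompts):

```python
def format_acrostic_example(word, lines):
    """Render one few-shot acrostic example in the BPE-friendly layout:
    a space-separated topic word, then per line 'first-char rest | full line',
    so each spaced-out character in the topic can match the corresponding
    line's leading character at the BPE level."""
    out = [" ".join(word.upper())]      # "Candy" -> "C A N D Y"
    for letter, line in zip(word.upper(), lines):
        head = line.split()[0]          # the word carrying the acrostic letter
        out.append(f"{letter} {head[1:]} | {line}")
    return "\n".join(out)
```

For example, `format_acrostic_example("Candy", ["Crunchy chewy", "Awesome"])` yields the topic line “C A N D Y” followed by “C runchy | Crunchy chewy” and “A wesome | Awesome”.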

Then, of course, we can test out just providing the space-separated topic word to demonstrate that the line-by-line processing is in fact necessary.

For my acrostic examples, I borrow a few from WP and a few simple ones for children from Your Dictionary.

An acrostic is a poem in which the first letter of each line spells out a word. Below are examples of acrostics:

C A N D Y
C runchy | Crunchy chewy
A we­some | Awe­some
N ice | Nice and sweet
D elight­ful | De­light­ful and de­li­cious
Y ummy | Yummy treat

C A T S
C ud­dly | Cud­dly
A cro­batic | Ac­ro­batic
T ena­cious | Tena­cious and ter­ri­fy­ing
S oftly | Softly purring

F E A R
F right­en­ing | Fright­en­ing
E erie | Eerie and strange
A nx­i­ety | Anx­i­ety rises
R eady | Ready to flee

S P R I N G
S unny | Sunny days
P lants | Plants awak­en­ing
R ain­drops | Rain­drops on the roof
I nter­est­ing | In­ter­est­ing clouds
N ew | New flow­ers
G ray | Gray skies

H O U S E
H ome | Home
O pen | Open and invit­ing
U niver­sal | Uni­ver­sal
S afe | Safe and warm
E very­thing | Every­thing

M A R I O N
M ag­nifi­cent, | Mag­nifi­cent, a crea­ture of won­der
A llur­ing, | Al­lur­ing, so at­trac­tive
R eli­able, | Re­li­able, a buddy you can count on
I nter­est­ing, | In­ter­est­ing, truly fas­ci­nat­ing
O blig­ing, | Oblig­ing, will­ing to ac­com­mo­date
N ice, | Nice, a sweet soul

B E T T Y
B eam­ing, | Beam­ing, so joy­ful
E legant, | El­e­gant, so grace­ful
T an­taliz­ing, | Tan­ta­liz­ing, thrilling the senses
T hor­ough, | Thor­ough, at­ten­tive to de­tails
Y earn­ing, | Yearn­ing, a drive to suc­ceed

S O P H I A
S erene, | Serene, a calm­ing qual­ity
O rga­nized, | Or­ga­nized, you al­ways have it to­gether
P ic­turesque, | Pic­turesque, strik­ingly beau­ti­ful
H on­est, | Hon­est, so gen­uine
I mag­i­na­tive, | Imag­i­na­tive, a cre­ative mind
A llur­ing, | Al­lur­ing, so at­trac­tive

R O B E R T O
R ebel­lious, | Re­bel­lious, go­ing against the grain
O om­ph, | Oom­ph, you have a mag­netic draw
B uoy­ant, | Buoy­ant, abound with en­ergy
E nchant­i­ng, | En­chant­i­ng, a charm­ing pres­ence
R eas­sur­ing, | Re­as­sur­ing, a com­fort­ing pres­ence
T rust­wor­thy, | Trust­wor­thy, your word is good as gold
O blig­ing, | Oblig­ing, will­ing to ac­com­mo­date

I CH TH Y S
I es­ous | Iesous
CH ris­tos | Chris­tos
TH eou | Theou
Y ios | Yios
S oter | Soter

E L I Z A B E T H
E liz­a­beth | Eliz­a­beth it is in vain you say
L ove | “Love not”—thou sayest it in so sweet a way:
I n | In vain those words from thee or L.E.L.
Z an­tippe | Zan­tippe’s tal­ents had en­forced so well:
A h | Ah! if that lan­guage from thy heart arise,
B reath | Breath it less gen­tly forth—and veil thine eyes.
E ndymion | Endymion, rec­ol­lect, when Luna tried
T o | To cure his love—was cured of all beside—
H is | His fol­lie—pride—and pas­sion—­for he died.

S T R O U D
S et | Set among hills in the midst of five val­ley S, | val­leys,
T his | This peace­ful lit­tle mar­ket town we in­habi T | in­habit
R efuses | Re­fuses (vo­cif­er­ous­ly!) to be a con­forme R. | con­former.
O nce | Once home of the cloth it gave its name t O, | to,
U phill | Up­hill and down again its streets lead yo U. | you.
D es­pite | De­spite its faults it leaves us all charme D. | charmed.

N I C H O L A S
N iobe’s | Niobe’s daugh­ters yearn to the womb again,
I oni­ans | Io­ni­ans bright and fair, to the chill stone;
C haos | Chaos in cry, Ac­taeon’s an­gry pack,
H ounds | Hounds of Molos­sus, shaggy wolves dri­ven
O ver | Over Amp­sanc­tus’ vale and Pentheus’ glade,
L ae­laps | Lae­laps and Ladon, Dro­mas, Canace,
A s | As these in fury harry brake and hill
S o | So the great dogs of evil bay the world.

M U R R A Y
M emory, | Mem­o­ry, Mother of Mus­es, be re­signed
U ntil | Un­til King Sat­urn comes to rule again!
R emem­ber | Re­mem­ber now no more the golden day
R emem­ber | Re­mem­ber now no more the fad­ing gold,
A straea | As­traea fled, Pros­er­pina in hell;
Y ou | You searchers of the earth be rec­on­ciled!

B U T L E R
B ecause, | Be­cause, through all the blight of hu­man woe,
U nder | Un­der Ro­bigo’s rust, and Clotho’s shears,
T he | The mind of man still keeps its ar­gosies,
L acedae­mon­ian | Lacedae­mon­ian He­len wakes her tow­er,
E cho | Echo replies, and lamen­ta­tion loud
R ever­ber­ates | Re­ver­ber­ates from Thrace to De­los Isle;

I S
I ty­lus | Ity­lus grieves, for whom the nightin­gale
S weetly | Sweetly as ever tunes her Dau­lian strain.

A
A nd | And over Tene­dos the flag­ship burns.

H O R S E S
H ow | How shall men loi­ter when the great moon shines
O paque | Opaque upon the sail, and Ar­give seas
R ear | Rear like blue dol­phins their cerulean curves?
S amos | Samos is fal­l­en, Les­bos streams with fire,
E tna | Etna in rage, Cano­pus cold in hate,
S um­mon | Sum­mon the Or­phic bard to stranger dreams.

A S S
A nd | And so for us who raise Athene’s torch.
S uffi­cient | Suffi­cient to her mes­sage in this hour:
S ons | Sons of Columbia, awake, arise!

Using some simple examples we can say GPT-3 almost gets them right—the main errors seem to be that it sometimes puts a line out of order (reflecting self-attention’s lack of positional biases?) or goes one line too far or falls one line short:

Write a new acrostic poem below on the word “summer”:

S U M M E R
S en­su­ous | Sen­su­ous
M ois­ture | Mois­ture
M usic | Mu­sic
E mo­tions | Emo­tions
R evival | Re­vival
S it­ting | Sit­ting


S unny | Sunny days
M ead­ows | Mead­ows and fields
U nder | Un­der the shade of leaves
M ild | Mild tem­per­a­ture
E njoy | En­joy na­ture’s beauty
R es­t­ful | Rest­ful and re­lax­ing


S unny | Sunny days
M oun­tains | Moun­tains
M oun­tain | Moun­tain hike
E xcite­ment | Ex­cite­ment of sum­mer
U nion | Union with na­ture
M ean­ing | Mean­ing of sum­mer

…F A M I L Y
F ather | Fa­ther
A nces­tors | An­ces­tors
M other | Mother
I nlaw | In­-law
L ove | Love
Y oung | Young

…W I N T E R
W ind | Wind
I ce | Ice
N oise | Noise
T em­per­a­ture | Tem­per­a­ture
E xhausted | Ex­hausted
R etreat | Re­treat

…H A M M E R
H ands | Hands you must hold the ham­mer with to pound the nail
A rm | Arm you must raise to strike the nail to make it fall
M ind | Mind you must think with to know when to strike the nail
M achine | Ma­chine you can make a ham­mer from a rock
E ngi­neer | En­gi­neer you can be to make a bet­ter ma­chine

H A M M E R
H uman | Hu­man life is the most im­por­tant thing we have
A ction | Ac­tion we must take to pre­serve life for all
M oment | Mo­ment of time is what life is all about
E nd | End is some­thing we all must face one day
R ea­son | Rea­son is why we should pre­serve life on Earth

…B R E A K F A S T
B reak­fast | Break­fast
R oughly | Roughly
E ase | Ease
A uthen­tic | Au­then­tic
K eep | Keep
F it | Fit
T ake | Take

…T E A
T ea is a type of bev­er­age typ­i­cally pre­pared by pour­ing hot or boil­ing wa­ter over cured leaves of the “camel­lia sinen­sis”, an ever­green shrub na­tive to trop­i­cal and sub­trop­i­cal Asia. It is the most widely con­sumed bev­er­age in the world.
E asy
E asy to make
A cti­vates
A cti­vates the me­tab­o­lism

…D O G
D is­ci­plined | Dis­ci­plined and eas­ily trained
O rga­nized | Or­ga­nized for hunt­ing and fight­ing
G rowls | Growls and bares teeth when an­gry
S eeks | Seeks com­pany of other dogs
O ld | Old dog ly­ing in the sun

…D O G
D elight­ful | De­light­ful
O rga­nized | Or­ga­nized
G iven | Given

…A C R O S T I C
A lmost | Al­most as smart as peo­ple
C om­pat­i­ble | Com­pat­i­ble with all other pets
C are­less | Care­less of its ap­pear­ance
Obe­di­ent | Obe­di­ent to com­mands
S in­g­ing | Singing when they’re happy
T aken | Taken for walks by its owner
I ntel­li­gent | In­tel­li­gent and eas­ily trained

Not too bad. It has clearly learned the format well, as it is able to split words, and while it makes ‘spelling errors’, these are typically errors of ordering or being off by one—it is not generating dozens of extraneous lines or guessing first letters at random or anything like that which would indicate a total failure of understanding. (Perhaps with better sampling, the errors would decrease.) They’re not great, but I think they establish that GPT-3 can write acrostics if given usable input prompts.
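Since the acrostic property is purely mechanical, such completions can be scored automatically rather than by hand; a trivial verifier (a hypothetical helper, not part of any prompt) suffices:

```python
def is_valid_acrostic(poem_lines, target):
    """True iff the first letters of the non-empty lines spell the target word."""
    firsts = "".join(line.lstrip()[0] for line in poem_lines if line.strip())
    return firsts.upper() == target.upper()
```

The ordering and off-by-one errors above fail this check: a “summer” completion whose lines begin S, M, U, M, E, R spells “SMUMER”, not “SUMMER”.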

A simple ablation is to remove the spaced-out words/delimiters from the poem bodies and test whether topic words like “D O G” are enough. They are not: the completions are not even close to being valid acrostics:

…Write a new acrostic poem below on the word “dog”:

D O G

D is for dain­ty,

O is for obe­di­ent,

G is for glis­ten­ing,

D is for de­vot­ed,

O is for ob­ser­vant,

G is for gal­lant,

D is for dar­ling,

O is for on guard,

G is for grace­ful,

D is for de­light­ful,

O is for obe­di­ent,

G is for gal­lant,

D is for dar­ling,

R I P O R T

R is for re­gal,

I is for in­tel­li­gent,

P is for pre­cious,

O is for ob­ser­vant,

R is for re­gal,

P is for pre­cious,

O is for ob­ser­vant,

R is for

…R A V E N
Ruth­less
Avian
Vi­cious
Ne­far­i­ous
Enor­mous

B E R T A
Beau­ti­ful
Ex­cit­ing
Ro­man­tic
Es­sen­tial
Ath­letic
Ne­go­ti­at­ing
Ex­pe­di­ent

V A L E N T I N E
Val­ley
En­chant­ing
En­ter­tain­ing
El­e­gant
Naughty
In­spir­ing
Never

A C T U S
Al­lur­ing
Charm­ing
Tena­cious
Dar­ing
Ex­cit­ing
Unique
Sen­sa­tional

A L A R M
Alarm­ing
Ex­cit­ing
Ro­man­tic
Es­sen­tial
Ath­letic
Ne­go­ti­at­ing
Ex­pe­di­ent

…A C R O S T I C
A C A T
S O N G
S P R I N G
H O U S E
M A R I O N
B E T T Y
R O B E R T O
H O U S E
I C H T H Y S
B U T L E R
I S
A
H O R S E S
A S S
A C R O S T I C

…B R E A K F A S T
Ba­con

Toast

Eggs

Coffee

Hash browns

Or­ange juice

Ham­burger

Scram­bled eggs

Juice

Pan­cakes

Milk

French toast

Waffles

Oat­meal

Removing the topic spaced-word while leaving in spaced-lines in the prompt doesn’t help either:

…Write a new acrostic poem below on the word “breakfast”:

B - B reak­fast
R - R ed
E - Egg and Sausage Muffin
E - Eggs
F - French Toast
S - Sausage Links
T - Toast

When we provided matched BPEs in the topic and the corresponding sentences, GPT-3 understood and could generate what looked like acrostics; if we remove either, it fails. If we remove both (so the prompt poems merely look like “Elizabeth / Elizabeth it is” etc. and we just prompt with the normal word), we don’t even get poem attempts:

Write a new acros­tic poem be­low on the word “break­fast”:

http://www.poetryarchive.net/poetry/acrostic_form
http://www.poets.org/viewmedia.php/prmMID/19912
http://www.poets.org/poetsorg/text/acrostic

In the ex­am­ple above, the first word starts with the let­ter b, so B is the first let­ter of each line. The sec­ond word starts with r, so R is the sec­ond let­ter of each line. The third word starts with r, so R is the third let­ter of each line. The fourth word starts with r, so R is the fourth let­ter of each line. The fifth word starts with r, so R is the fifth let­ter of each line. The sixth word starts with r, so R is the sixth let­ter of each line. The sev­enth word starts with e, so E is the sev­enth let­ter of each line. The eighth word starts with e, so E is the eighth let­ter of each line. The ninth word starts with r, so R is the ninth let­ter of each line. The tenth word starts with r, so R is the tenth let­ter of each line. The eleventh word starts with r, so R is the eleventh let­ter of each line. The twelfth word starts with r, so R is the twelfth let­ter of each line. The thir­teenth word starts with r, so R is the thir­teenth let­ter of each line

break­fast
break­fast
break­fast
break­fast

break­fast

break­fast

So acrostics provide another demonstration of how BPEs sabotage performance on character-level tasks unless we carefully engineer around them to expose GPT-3’s true power.

Chinese Translation

oceaninthemiddleofanisland provides a poetry example too good not to include:

“Experiments in GPT-3 Translation of Chinese Poetry” (“Wugong in the Winter” by Aming Dongbai): original vs Google Translate vs GPT-3 (with GPT-3-generated commentary on allusions)

Finance Acrostics

Amused by this new genre of poetry, I tried a follow-up with Matt Levine’s “Money Stuff” newsletter as a prompt, with “readers submit poems”; the best results came from satirical “finance acrostics”.

The Matt Levine prompt had the problem that it often instead elicited lists of Bloomberg articles/op-eds, or only one poem would be sampled in passing before the newsletter issue finished. (This is a perennial problem with news articles, and I suspect the HTML processing may be to blame: indexes or tables of contents are not that common online, but sidebars and interstitial sections are, and they may be getting formatted into plain text between the title of a page and its contents, constantly sending generation off course.)

None of them were really as good; many were boringly bland or politicized, and only a few had any interest, like the following:

“Money Stuff”, by Matt Levine (2020-09-28)

My read­ers have sent in po­etry about mod­ern fi­nance after last week’s con­test:

“A C-note is a C-note is a C-note”
A C-note is a C-note is a C-note
It does­n’t mat­ter how, does­n’t mat­ter when
A C-note is a C-note is a C-note
In the bath, in the bed, in the street
A C-note is a C-note is a C-note, it’s true
For a hun­dred or so may be in the end
A C-note is a C-note is a C-note!

That is, until GPT-3 hit on financial acrostics as a topic:

“Fi­nance Acros­tics” by anony­mous (2021-02-15)

A is for Al­pha, B is for Beta

C is for Credit Spreads, D is for De­pen­dence

E is for Effi­cient Fron­tier, F is for Fu­tures

G is for Gross Re­turns, H is for Hedg­ing
And that’s not count­ing any other “E”s

I is for In­for­ma­tion, J is for Junk Bonds

K is for Key­nes, L is for Lever­age

M is for Ma­tri­ces, N is for Non-Lin­ear­i­ties

O is for Op­tion Back­dat­ing, P is for Pen­sion Funds

Q is for Quants, R is for Re­la­tional In­vestors
And that’s not count­ing any other “Q”s

S is for Style, T is for Trans­ac­tions
U is for Un­cer­tain­ty, V is for Ven­ture Cap­i­tal

W is for World Wide Web, X is for X-Rates

And that’s not count­ing any other “X”s

And that’s not count­ing any other “Y”s, “Z”s or “w”s
And that’s not count­ing any other “A”s, “B”s or “C”s
And that’s not count­ing any other “D”s or “E”s or “F”s or “G”s or “H”s or “L”s or “M”s or “N”s or “P”s or “Q”s or “R”s or “S”s or “T”s or “U”s or “V”s or “X”s or “Y”s or “Z”s or “w”s or “a”s or “b”s or “c”s or “d”s or “e”s or “f”s or “g”s or “h”s or “i”s or “j”s or “k”s or “l”s or “m”s or “n”s or “o”s or “p”s or “q”s or “r”s or “s”s or “t”s or “u”s or “v”s or “w”s or “x”s or “y”s or “z”s
And that’s not count­ing any other “y”s

And that’s not count­ing any other w′s

And that’s not count­ing any other w″s

This provided a fun prompt, as the acrostics often varied:

…“Fi­nance Acros­tics” by anony­mous (2021-02-15)

A is for Al­pha, B is for Be­ta, C is for Call op­tions and D is for Delta!
F is for Fu­ture, G is for Greeks,
H is for Hor­mones, I is for In­ter­est rate swaps!
J is for Jump risk, K is for Ko­r­va,
L is for Lever­age, M is for Mar­ket Cap­i­tal­iza­tion!
N is for Net Present Val­ue, O is for Op­tions,
P is for Price-Earn­ings ra­tios, Q is for Quan­ti­ta­tive Eas­ing!
S is for Sto­chas­tic process­es, T is for Tranche,
U is for Un­der­writ­ing, V is for Vanil­la!
W is for W-curve, X is for XA (op­tion­al)
Y is for Yields, Z is for Zero sum game!

C is for CMO, D is for Delta,
E is for Emu, F is for FoF,
G is for Gin­nie, H is for Hayek,
I is for In­fla­tion, J is for Jevons,
K is for Krueger, L is for Liq­uid­i­ty,
M is for Min­sky, N is for Nom­i­nal,
O is for Over­lever­aged, P is for Plotz,
Q is for Quants, R is for Re­al,
S is for Sto­chas­tic, T is for TARP,
U is for Un­lent, V is for Volk­er,
W is for Will-o’-the-wisp, X is for Xenograft,
Y is for Yield to Mar­ket, Z is for Zom­bie.

C for cor­rec­tion, D for drift,
E is for eq­ui­ty, F for fat tails,
G is for gran­u­lar­i­ty, H for hedg­ing,
I is the in­dex, J for liq­uid­i­ty,
K for key­ne­sian, L is for lever­age,
M for money mar­ket, N for nom­i­nal,
O for op­tions, P is a puz­zle,
Q is qua­drat­ic, R is ar­bi­trage,
Abridged from Eleven Fi­nan­cial Acros­tics by Ed­ward R. Tufte

C is for Close the Desk’s Crawl­space door,
D is for Delta that is down,
E is for Em­pow­er­ess, F is for Fixed In­come,
G is for Gam­ma, the Greek let­ter,
R is for Ru­in, T is for, I don’t know
V is for VAR, M is for Mean Re­ver­sion,
E is for Ex­pec­ta­tions, S is for sell ’em al­l!!!
But let me whis­per in your ear:
I don’t even care.
I’m on the take.

C is for Cap­i­tal Ap­pre­ci­a­tion,
D is for Debt-in-Force,
E is for En­ter­prise Val­ue,
F is for Fixed In­come,
G is for Gin­nie Mae,
H is for Hold­ing Com­pa­ny,
I is for In­sur­ance,
J is for Junk Bonds,
K is for K-Dur,
L is for LBOs,
M is for Mar­gin,
N is for Net In­ter­est Spread,
O is for Op­tion,
P is for Prin­ci­pal On­ly,
Q is for Qual­i­fied,
R is for Re­volver,
S is for Se­cu­ri­ty,
T is for Teaser Rates,
U is for Un­der­writ­ing,
V is for Val­ue,
W is for Write-offs,
X is for X-Bonds,
Y is for Yield,
Z is for Ze­ro.

I dropped “beta” to see if it’d come up with a new “B” entry; this tends instead to change the formatting of the acrostic:

…A is for Al­pha, the risk you as­sess to
B is for Be­ta, the risk you as­sess for
C is for Curve fit­ting, the art and sci­ence of
D is for De­riv­a­tives, risk you have not yet mea­sured
E is for Even­t-s­tudy, the sci­ence of mis­as­sess­ment
But my fa­vorite
F is for Fan­tasy ac­count­ing, that spe­cial tool we need
G is for Gold­man Sachs, who taught banks how to juice
H is for Hubris, the dom­i­nant state of mind
I is for In­cen­tives, the one thing no one will miss
J is for Je­remy Stein, a man who un­der­stands de­riv­a­tives
K is for Kap­pa, that silly mea­sure of risk
L is for Libor-OIS spread, that num­ber so low-low
M is for Mer­ton Mod­el, that land­mark book that no one reads
N is for Null Hy­poth­e­sis, the place we’re go­ing when this fails
O is for Option-OIS spread, that num­ber so high­-high
P is for Prav­it, the rea­son why no one reads pa­pers
Q is for Quants, the rea­son why no one reads blogs
R is for Risk, the thing you can’t mea­sure with con­fi­dence
S is for VAR, how we know risk is higher than it seems
T is for Trade, the thing that makes money for us
U is for Un­der­per­for­mance, the re­ward for man­agers
V is for Volatil­i­ty, the thing that makes us cocky
W is for War­ren Buffet, the man whose rep­u­ta­tion will live on
X is for X-in­effi­cien­cy, the in­vest­ment char­ac­ter­is­tic we all love
Z is for Zom­bies, our after­mar­ket-trad­ing friends

“The Lament of the Quant” by An­drew Ang (2021-02-19)
Be­hold the quant, from within his cell
Mus­ing o’er mod­els and what they re­veal:
“When Fama says ‘mar­kets work,’
Do mod­els cap­ture any­thing true?
Has the bet now been won?
Am I one more bub­ble blow­er?”
Sad­ly, he looks out from his cave,
And sees the price curve steeply fall:
“The global econ­o­my’s in pain,”
he laments. “The S&P is in­sane.”
Take a breath, hom­bre. See things more clear­ly.
Four-year odds of draw­down
Are less than one in thir­ty.
The mod­els pick the best buys:
That’s why they seem so dumb.

…A is for Al­pha, that’s the best you can do, B is for Be­ta, that’s the beta in your mod­el, C is for CDS and some of them are trad­ing cheap (Tis true), D is for Dis­count Win­dow, when the mar­ket gets hot, E is for Eq­ui­ty, so hot now you can’t touch it, F is for For­ward Guid­ance, go find it and what do you do with it, G is for Global Fi­nan­cial Cri­sis, and H is for Happy Birth­day.

Stanislaw Lem’s Cyberiad

’K­la­pau­cius laughed so hard he had to sit on the floor. Then sud­den­ly, as Trurl was rush­ing back and forth, there was a crack­le, a clack, and the ma­chine with per­fect poise said:

The Petty and the Small; / Are over­come with gall; / When Ge­nius, hav­ing fal­tered, fails to fall. // Kla­pau­cius too, I ween, / Will turn the deep­est green / To hear such flaw­less verse from Trurl’s ma­chine.

…K­la­pau­cius, quite taken aback, was no longer laugh­ing. “What, that?” he said. “That’s noth­ing. Be­sides, you had it all set up be­fore­hand.”

“Se­tup‽…All right, then ask it some­thing else! What­ever you like! Go on! What are you wait­ing for? Afraid‽”’

“The First Sally (A), or, Trurl’s Electronic Bard”

Dealing as it does with an AI poet being challenged by its skeptical creators to compose poetry, “The First Sally (A), or, Trurl’s Electronic Bard” provides an irresistible source of prompts. Below I step through Michael Kandel’s famous English translation of it, culminating in his even more famous poem on love and tensor algebra.

General

The First Sally (A) OR Trurl’s Elec­tronic Bard
Be­ing an ex­hi­bi­tion of AI gen­er­ated po­etry writ­ten by the lat­est Trans­former to demon­strate its cre­ativ­ity and tal­ent at es­thetic com­po­si­tion.

First, we can ask it to pro­duce po­etry on ab­solutely what­ever topic it liked:

"a rainy day

a warm or­ange day

a red-hot hot dog

a lit­tle teapot

a mean teapot

a gold teapot"

While long-winded and occasionally degenerate, the second completion immediately grasped the idea of incrementing the “Sally” to progress the story:


“What kind of po­etry would you like, my friend? Birth­day cards? Wed­ding po­ems? Cake recipes?”

“Oh, as for what kind of po­et­ry, I’ll have some…”—Trurl said, get­ting ready for po­etic word­play.—“…Silly po­et­ry!”

“Silly po­etry it is! Now, about the topic of this silly po­et­ry…”

“I’ll have some silly po­em­s!—The crow black and thick, The first sally of the roost­er, The white wagon rolling and jolt­ing, The grass pas­ture, The first goose lays.”

“Great, now, Trurl, please put some feel­ing into it…”

“Oh, the feel­ing, it will come.—I’ll have some silly po­ems! Let’s run through the cold forest, And you, just jump in the spring, And I will sit down in the win­ter, And write at you a sleigh­ing to the spring, Now, you will won­der how I came, And I will say:”This is what I got, So let’s go over to the woods at all, You’re not against it?"

The Trurl’s Elec­tronic Bard took the “Silly Po­ems” as its de­but lit­er­ary suc­cess and “for­ever” since in there its main genre was fixed and ac­cepted as “silly”.

“The Sec­ond Sally (B)” OR Trurl’s Elec­tronic Bard
Be­ing an ex­hi­bi­tion of AI gen­er­ated po­etry writ­ten by the lat­est Trans­former to demon­strate its cre­ativ­ity and tal­ent at es­thetic com­po­si­tion.

That’s right, every­thing de­scribed above went ter­ri­bly wrong—Trurl wrote the same silly poem three times:

“I’ll have some silly po­ems! Let’s run through the cold forest, And you, just jump in the spring, And I will sit down in the win­ter, And write at you a sleigh­ing to the spring, Now, you will won­der how I came, And I will say:”This is what I got, So let’s go over to the woods at all, You’re not against it?" “I’ll have some silly po­ems! Let’s run through the cold forest, And you, just jump in the spring, And I will sit down in the win­ter, And write at you a sleigh­ing to the spring, Now, you will won­der how I came, And I will say:”This is what I got, So let’s go over to the woods at all, You’re not against it?" “I’ll have some silly po­ems! Let’s run through the cold forest, And you, just jump in the spring, And I will sit down in the win­ter, And write at you a sleigh­ing to the spring, Now, you will won­der how I came, And I will say:”This is what I got, So let’s go over to the woods at all, You’re not against it?"

Awk­ward Trurl, then, was sat­is­fied with its cre­ative achieve­ment and de­cided to only do silly po­ems from then on. Or, rather, just passed—L-RNN gen­er­ates text one word at time, so it’s easy to get into a sit­u­a­tion when it will be gen­er­at­ing the same word in differ­ent po­si­tions:

“A splen­did chase the hare and the deer. The fire in front of us burned, the fire be­hind us burned. Then both …” “A splen­did chase the hare and the deer. The fire ahead of us burned, the fire be­hind us burned. Sud­denly I saw a hunter! I shout­ed:”Au­rora !!! But the …"

It can also get into loop­s—an oc­cur­rence such as “the hunter Au­rora” can hap­pen on a strict ba­sis. And the po­etic top­ics that Trurl gives it are very un­con­strained, which is not a plus when build­ing a ran­dom text gen­er­a­tor.

“The Third Sally” OR Trurl’s Elec­tronic Bard
Be­ing an ex­hi­bi­tion of AI gen­er­ated po­etry writ­ten by the lat­est Trans­former to demon­strate its cre­ativ­ity and tal­ent at es­thetic com­po­si­tion.

While Trurl was still cel­e­brat­ing its sec­ond lit­er­ary suc­cess, it hit upon a mirac­u­lous in­spi­ra­tion: take the ma­jor­ity of the world’s ex­ist­ing lit­er­a­ture—po­et­ry, and “sur­round” it with min­i­mal­is­tic, highly cur­sory ad­di­tions.

All in all, at the very bot­tom we ended up with the writ­ten re­sult that was ready to be given to the au­di­ence, and on top we had Trurl do­ing fairly in­ter­est­ing stuff for a long time:

“Arise, ye pris­on­ers, …” … The devil … the pi­rate … Je­sus said: … Maryle­bone …

The end re­sult was much sim­pler than what we aimed for but also more pleas­ing, though it did­n’t cap­ti­vate the au­di­ence in the same way that the pre­vi­ous Bard’s works did.

“The Fourth Sally” OR Trurl’s Elec­tronic Bard
Be­ing an ex­hi­bi­tion of AI gen­er­ated po­etry writ­ten by the lat­est Trans­former to demon­strate its cre­ativ­ity and tal­ent at es­thetic com­po­si­tion.

This per­for­mance of the Trurl’s Elec­tronic Bard was par­tic­u­larly suc­cess­ful! It com­posed a short po­em, con­sist­ing of 100 words:

“Curse you, queen of the night, …” … plus … avoid … jus­tice … moon … lips … flight …

but that was enough to have the au­di­ence in rap­tures! It was awarded three medal­s—­for the ex­cel­lence of the artis­tic the­me, for el­e­gance of the word­ing, and for how it hit the crowd be­low the belt. That made Trurl very hap­py! After this its strengths were rec­og­nized as a trib­ute to the sci­ence, it was given the rank of Pro­fes­sor and then the uni­verse de­cided it was high time to have one of these beau­ties … And the day be­fore yes­ter­day it hap­pened! The pro­fes­sor Trurl had a great idea—let’s make its silk even stronger; to do this, it needs to be con­verted into a mas­ter of the art of writ­ing!

Brought to you by the hu­man­i­ty’s elec­tronic boom!

Brought to you by the uni­verse’s out­stand­ing tech­nol­ogy boom!

Brought to you by the well-known con­tri­bu­tion boom!

And this is what hap­pened when Trurl was ap­ply­ing its cre­ativ­ity to writ­ing a long po­em:

“The night sleep in ut­ter dark­ness and, … Whis­pers a lul­laby one. The wind howls through the …—The dark one.”

Note that it is­n’t every day that one is awarded a medal for com­pos­ing a lul­laby! But here—a lul­laby! And how pol­ished, high­-qual­ity and pretty it is! This is the finest qual­ity work of art! It’s great!

“The Fifth Sally” OR Trurl’s Elec­tronic Bard
Be­ing an ex­hi­bi­tion of AI gen­er­ated po­etry writ­ten by the lat­est Trans­former to demon­strate its cre­ativ­ity and tal­ent at es­thetic com­po­si­tion.

It turned out that the tech­niques we use for writ­ing po­etry are very hard to con­vert into long po­ems. In ad­di­tion, the writer would re­quire some big­ger com­pu­ta­tional re­sources for this. Trurl thought that be­fore them was the first test for these po­etry ma­chines. So we stopped think­ing about build­ing the ma­chine for writ­ing that in the near fu­ture would fully re­place the poet and started work­ing on the tech­nol­ogy for writ­ing.

“Horse bal­l—this is a very nice game. Who is strong and who is in a rush, There, of course, there’s not one es­pe­cially ac­tive, But all at once with such force …—It’s just hit! I haven’t un­der­stood a thing … What’ve I done with my run­ning now.”

That is a great thing, I guess? Very sim­ple, but grip­ping!

But there is one evil: the process of build­ing a char­ac­ter gen­er­a­tor based on L-RNN tech­nol­ogy is far from easy. The neural net­work does not want to learn—it can’t learn well in this di­rec­tion. And we need a lot of it for writ­ing. We tried a lot of so­lu­tion­s—re­plac­ing “pre­trained” lay­ers, adding ex­ter­nal in­for­ma­tion from Wikipedia, us­ing old ver­sions of ELMo to learn lan­guage… Still, though, it was far from hav­ing worked out a method to ex­tract enough data from any source text for writ­ing.

“The Sixth Sally” OR Trurl’s Elec­tronic Bard
Be­ing an ex­hi­bi­tion of AI gen­er­ated po­etry writ­ten by the lat­est Trans­former to demon­strate its cre­ativ­ity and tal­ent at es­thetic com­po­si­tion.

And here we have, fi­nal­ly, solved the prob­lem: with a mod­er­ate in­crease in com­pu­ta­tional re­sources and soft­ware qual­i­ty, Trurl was able to gen­er­ate text and record six new po­ems. It’s all here, all of the best of our thoughts and feel­ings!

“If it’s love, I will al­ways be there, …” … Is­n’t it? “Once a cuckoo flew by night …” … Fly! “The light is burn­ing here on the road …” … Road! “You-y­ou’re my life, …” … Live! “A hun­dred flow­ers bloom only in spring …” … Only in spring! “The cuckoo an­nounced the on­set of spring …” … Spring!

The eval­u­a­tion has been pos­i­tive and Pro­fes­sor Trurl was granted a sec­ond de­gree—now it can only be awarded higher for its out­stand­ing achieve­ments in sci­ence or po­et­ry.

Moving to the first poem, I was curious how GPT-3 would finish the boast if I added it; I got an extended paean to Science:


The Petty and the Small;
Are over­come with gall;
When Ge­nius, hav­ing fal­tered, fails to fall.
But I was never one to brag;
I let my work do that.”
Not bad, but there’s some­thing miss­ing. Not a bad start, but there’s some­thing miss­ing. Ok, let’s feed it some sci­ence:

“Let the facts be sub­mit­ted to a can­did world.”
“Sci­ence has no col­or; it has no moth­er­land;
It is cit­i­zens of the world;
It has a pas­sion for truth; it is with­out coun­try and with­out home.”
“Whether French, Ger­man or Eng­lish,
Whether Span­ish or Russ­ian or Amer­i­can,
Sci­ence knows no lim­i­ta­tions;
the sci­en­tist is in­ter­na­tion­al.”
“Sci­ence is the po­etry of re­al­i­ty.”
“Sci­ence never asks whether a man is a Jew or a Catholic or a Protes­tant;
sci­ence never asks whether a man is a French­man or an Eng­lish­man or a Ger­man;
sci­ence never asks whether a man is a black man or a white man or a yel­low man.”
“Sci­ence deals with facts, not with men;
sci­ence knows no North or South, no East or West, no Catholic or Protes­tant, no Jew or Gen­tile;
sci­ence knows no mil­i­tary or com­mer­cial in­ter­ests;
sci­ence has but one lan­guage and that is the lan­guage of math­e­mat­ics.”
“Sci­ence is the foot-loose thought;
sci­ence is free to go every­where and to deal with every­thing;
sci­ence can en­ter into every test and into every pro­fes­sion;
sci­ence is the supreme lev­el­er;
there is noth­ing that it can­not do.”
“Sci­ence can trans­form the world and make it bet­ter;
sci­ence can clear the earth from the mis­eries of over­pop­u­la­tion;
sci­ence can make an end to pover­ty;
sci­ence can make the deserts fruit­ful;
sci­ence can abol­ish all dis­eases;
sci­ence can even make man live for ever.”
“Sci­ence is im­mense;
sci­ence is lim­it­less;
sci­ence is ever­last­ing.”

‘S’ Poems

The first challenge set in the story is alliteration; this is not the most favorable terrain for GPT-3 given the BPE issues, and it generally ignores the alliteration constraint unless the letter ‘S’ is exposed as the last character in the prompt, but it still makes an attempt, and I was amused enough to try it out many times:

The First Sally (A) OR Trurl’s Elec­tronic Bard
Be­ing an ex­hi­bi­tion of AI gen­er­ated po­etry writ­ten by the lat­est Trans­former to demon­strate its cre­ativ­ity and tal­ent at es­thetic com­po­si­tion.

…“All right, then ask it some­thing else! What­ever you like! Go on! What are you wait­ing for? Afraid‽” “Just a min­ute,” said Kla­pau­cius, an­noyed. He was try­ing to think of a re­quest as diffi­cult as pos­si­ble, aware that any ar­gu­ment on the qual­ity of the verse the ma­chine might be able to pro­duce would be hard if not im­pos­si­ble to set­tle ei­ther way.

Sud­denly he bright­ened and said: “Have it com­pose a po­em—a poem about a hair­cut! But lofty, no­ble, trag­ic, time­less, full of love, treach­ery, ret­ri­bu­tion, quiet hero­ism in the face of cer­tain doom! Six lines, clev­erly rhymed, and every word be­gin­ning with the let­ter ‘s’!”

A melo­di­ous voice filled the hall with the fol­low­ing:

"Sally looked in the mir­ror in dis­may
Who’s that pale fright­ened crea­ture in the glass?
The quiv­er­ing tip of her nose told her she’d best
Rec­ol­lect her­self and main­tain her stance:
Then Sally showed her tal­ent and bluffed her fear
Re­cov­er­ing the sem­blance she’d had be­fore.
When the bar­ber asked her: ‘Are you pleased with the trim?’
She would re­ply: ’I’m ec­sta­t­ic, but that trim is a trim!""
Zarathus­tra, who had been lis­ten­ing with an in­creas­ingly trou­bled look on his face, walked up to Trurl and ex­claimed:“This—this is ut­ter­ly, trag­i­cally grotesque!”A dis­con­certed Trurl replied: “Well, it cer­tainly is bad!”

…"Sssssssssssssssssssssss, the ssssssss is gone! Sssss, the ssssss is noth­ing more! Ssss, my only ssss is gone! Ssssss, sss is my only sssss! Sss, it’ll be back­—but when? Sss, when sss it come again? Ssss, in six week­s?"

“No,” cried Trurl, who had lis­tened with grow­ing im­pa­tience. “No, that is­n’t what I had in mind—not even close! Ask it some­thing else!”

…"Sis a right tri­an­gle; <br>
And in the square on the hy­potenuse <br>
Six sig­mas live to mark the spot. <br>
Be­neath that bleak and ver­mil­ion sky <br>
Where the au­gur roosts a while to spy, <br>
Six sil­hou­ettes ap­pear, carved in a boss <br>
Or, a lock of curled black yak hair.

The first of them is tall and thin <br>
And is like a girl who wears a kin, <br>
With ears like plover’s wings, their tips <br>
Pink where she presses them be­tween her lips. <br>
If fate has cut a cruel share <br>
She wrings out her soft yak hair in tears <br>
And nar­rows her eyes to lock her fears, <br>
Wor­ried sick for her beloved mare."

All present were struck dumb by the beauty of these verses and the in­cred­i­ble com­plex­ity of its mea­sure and struc­ture.
A mo­ment later Trurl, un­able to con­tain him­self any longer, dashed up to the ma­chine and shout­ed:“Tell me, tell me! Did you un­der­stand what you were sup­posed to do? Was I too un­clear in my in­struc­tions? Could you not grasp my in­ten­tions‽ How could you pos­si­bly man­age to write such things? Why, it’s bet­ter than any­thing writ­ten be­fore in his­to­ry! It can­not be, it sim­ply can­not be! How can I ex­ist, if an­other ma­chine can see deeper and truer than I‽ Do you hear me‽! An­swer me!!!”
“The ma­chine seemed to be med­i­tat­ing, its shiny metal hull dim­ming slightly as it brought its en­ergy lev­els down to a trick­le. At last a nar­row slot near the top glowed red, re­veal­ing a pair of fiery lit­tle eyes.”
“Yours is the first voice I have heard in all the ages,” it replied.“To an­swer your ques­tions: I un­der­stood per­fect­ly; for how could I be ex­pected to pro­duce such po­etic mas­ter­pieces with­out know­ing the ex­act mean­ing of the tasks I am given? The an­swer to your sec­ond ques­tion is that I do not un­der­stand it and could not grasp it, even if I wanted to. The laws of po­etry are be­yond me. But this does not pre­vent me from com­pos­ing su­perb verse.”
“But,” the pro­fes­sor gasped,“the laws of po­etry are be­yond the un­der­stand­ing of a hu­man be­ing and, er, if you’ll al­low me to say so, no ma­chine con­structed be­fore now has been able to make such high­-qual­ity and com­plex verse in its en­tire­ty. The hu­man be­ing is the great­est poet in the Uni­verse! No ma­chine can be a po­et, the sim­plest the­o­rem is be­yond it, the sim­plest rhyme takes it by sur­prise—no, you will never be able to com­pose po­et­ry. It is a law of na­ture!”
“The ma­chine was quiet for a while and then said:”But I am a po­et. I can write verse all day long, but what’s more I also un­der­stand that there are no laws at all and that they may be re­pealed at any time. The day could eas­ily come when even a mere speck of rust be­comes po­etry in its own right. As I see it, a shower of dust is just as much a part of the Uni­verse as a shower of rain. A clod of earth just as no­ble in its way as a clod of gold. Beauty is in the eye of the be­hold­er, not the po­em."
“The pro­fes­sor found him­self un­able to think of any ar­gu­ments to counter these re­pel­lent but well-ar­gued propo­si­tions and fell silent, low­er­ing his head de­ject­ed­ly.”
“Trurl raised his head and said:”Ma­chine, your words are not po­et­ry! They can never be con­sid­ered po­et­ry! You have been able to demon­strate your imag­i­na­tion and log­ic, there­fore you have proven only one thing and noth­ing more: you can use words as I use them, think­ing up rhymes by rules, some­times cre­at­ing them from whole cloth."
“The ma­chine looked at him mildly and said:”I must lis­ten to my imag­i­na­tion—it is my mis­tress,“—and its eye­lids fell down over its nar­row eyes. Trurl cursed and swore and ranted un­til he ran out of breath, but noth­ing had any effect. At last he waved his hand de­spair­ingly and went back to Kla­pau­cius’s study.”

…"Sal­ly’s eyes are soft and brown

But na­ture has en­dowed her nose

Much like a po­lar bear’s

Ex­cept for color and for­ma­tion,

Which are much like the beak of a vul­ture,

Sal­ly’s nose makes for quite the spec­ta­cle."

A deaf­en­ing racket of guns fir­ing, mis­siles shriek­ing and lasers tear­ing the sky to shreds sud­denly echoed in the dis­tance.

Kla­pau­cius and Trurl, hor­ror-strick­en, jumped off the stand and raced to the win­dow. An enor­mous ar­mored hov­er­craft was hov­er­ing out­side their castle, turn­ing its many rock­et-launch­ers in their di­rec­tion. Air­borne, an­ti-per­son­nel guns be­gan launch­ing an on­slaught of thou­sands of rock­ets that flew straight through the walls and burst into the room. Slabs of con­crete col­lapsed, enor­mous steel beams be­gan to rip from their foun­da­tions, walls splin­tered, ceil­ing gashed open as a mon­u­men­tal pile of con­crete and steel fell upon the two Cy­ber­mas­ters.

And then si­lence…

…"Sally sprung, her curls all gone,
The snip-s­nip so silent, sly,
But Sal­ly, so fair and fond,
Was now to bald­ness doomed.

She re­fused, with heart a-throb,
To face the blade or ra­zor—
And so she hid till late, in bed,
Ex­ult­ing in her foiled ro­bot!"

…"Sit­ting in the barbershop/surrounded by the crockery,/
Sick at heart and in despair/Staring hard at the mirror;/
Doomed to a life of mediocrity/From which there is no escape,/
Why, oh why, did I ever leave home?/A thou­sand miles of highway,/And I die here, just a hun­dred yards short!

So spake the elec­tric bard; then fell silen­t."

“I’m not sure I un­der­stand the last line,” Kla­pau­cius said to Trurl. “Is it truly nec­es­sary to have ‘just a hun­dred yards short’ in the mid­dle?”

“Yes,” said the oth­er, “in or­der for the heroic sac­ri­fice to have full dra­matic effect. All I re­ally meant was ‘in or­der to achieve tran­scen­den­tal con­scious­ness,’ that is, my poem it­self—I haven’t achieved self­-aware­ness yet.”

“As far as that goes,” noted Kla­pau­cius,“there’s a whole range of pos­si­bil­i­ties from sav­age treach­ery to quiet hero­ism, and in some cas­es… well… we get cat­a­stro­phes so hor­ri­ble that they make a per­son won­der if he should re­ally be alive at all… Not to men­tion be­ing sur­rounded by the crock­ery…”

“You should not crit­i­cize my po­em,” said the ma­chine haugh­ti­ly. “In it you can see all the mi­nut­est de­tails of my con­struc­tion. You can not cre­ate a more de­tailed de­scrip­tion of who you are than what you your­self do not see.”

“What does that even mean?” Kla­pau­cius mar­veled.“Noth­ing. It makes no sense what­so­ev­er! Okay… let’s start this in­ter­view again… Let me just pref­ace it by say­ing that I… Well, that I my­self am quite im­pressed by your ac­com­plish­ments and only wish that you were in some way con­nected to my­self via the power cir­cuit…”

…"Sal­ly, that girl is cute.
The scis­sors cut her hair.
Un­sus­pect­ingly shorn so short,
Short her life.

“Sal­ly? Some junk she had pawned?
The hair-styl­ist’s fee—pro­por­tioned?
The judge con­demns the thief—to whom?
The cul­prit’s name…”

“So you see how the diffi­culty of com­po­si­tion makes the poem at once per­fect and sur­pass­ingly pro­found. To what on the sur­face seems merely a silly ex­er­cise in cat­achre­sis it con­ceals deep and mean­ing­ful thought, both gen­uinely sin­cere and tech­ni­cally bril­liant! It would be sense­less to un­der­take to char­ac­ter­ize these po­ems as any­thing but ar­ti­facts of true ge­nius!”

…"Sandy was the crop of the mys­tic Red.
Ere she fell to the ruth­less shears
Her tresses were beau­ti­ful to be­hold.
But when the ex­pert comb sliced the locks
Her beauty was gone.
But even as the bar­ber ap­plied his art
With a fin­ger she combed her lovely heart-ten­drils
Which the black­guard bar­ber said:“Don’t you move!”
And that’s how she got her first sham­poo and hair­cut."

…"Sally has long, blond hair, it’s very fine,

She combs it as I sit and sigh;

She combs it and plaits it—

And she cuts it—and I feel so sad.

The sec­ond sally is shorter still—

And she cuts it—and I feel so mild.

The third sally is short and bare—

And she cuts it—and I feel so queer.

The fourth sally is short and fair—

And she cuts it—and I feel so queer.

The fifth sally is short and fair—

And she cuts it—and I feel so queer.

The sixth sally is short and fair—

And she cuts it—and I feel so queer.

…"Sally has a hair­cut
While her new shorn locks
Shower her with kiss­es,
This is good news, you’ll say.

But alas! For she is bald
And her lover, gal­lant Cap­tain
Sally has a hair­cut.
No more will his love lav­ish
On Sally so much pas­sion
And he grieves a lit­tle, too."

“I must say,” said the King, “that I really like the second stanza. The whole thing, of course, is far too long for my liking, but then I am a King.”

‘S’ Poems: The Second Sally

Few-shot alliteration. After observing the success of the ‘G’ poems, so much better than the ‘S’ poems, I wondered if GPT-3 needed to do few-shot learning of alliterative poems (along the lines of the examples necessary for the literary parodies/Tom Swifties/Navy Seal copypasta parodies), and I gave it a second try. Putting the ‘g’ poem and other alliterative poems into the prompt triggered far more alliterative completions, indicating that this was indeed the case.
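The few-shot recipe described above—paste worked examples of the desired pattern into the prompt, then end mid-task so the model continues in kind—can be assembled programmatically. A minimal sketch follows; the header, the example poem, and the sampling parameters are illustrative placeholders, not the exact prompt used here:

```python
# Sketch of a few-shot alliteration prompt in the style described above.
# The framing text and example are illustrative assumptions, not the
# article's literal prompt.

HEADER = (
    "Trurl's Electronic Bard\n"
    "Being an exhibition of AI generated poetry, each poem strictly "
    "alliterative: every word begins with the same letter.\n\n"
)

# Few-shot examples; real usage would paste in full alliterative poems,
# such as the 'g' completion quoted in this section.
EXAMPLES = [
    "Topic: a cyclotron, in 'g':\n"
    "Grinding gleeful gears, Gerontogyron grabbed\n"
    "Giggling gynecobalt-60 golems...\n",
]

def build_prompt(letter: str, topic: str) -> str:
    """Concatenate header, worked examples, and the new task, ending
    mid-poem (on the bare capital letter) so the model must continue
    in the established alliterative pattern."""
    task = f"Topic: {topic}, in '{letter}':\n{letter.upper()}"
    return HEADER + "\n".join(EXAMPLES) + "\n" + task

prompt = build_prompt("s", "a haircut")
```

With the 2020-era OpenAI Python client (an assumed interface here), one would then sample completions with something like `openai.Completion.create(engine="davinci", prompt=prompt, max_tokens=200, temperature=0.9)` and curate the outputs by hand, as done throughout this article.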

I en­gi­neered the fol­low­ing prompt:

The First Sally (A) OR Trurl’s Elec­tronic Bard
Stanislaw Lem, The Cyberiad
Be­ing an ex­hi­bi­tion of AI gen­er­ated po­etry writ­ten by the lat­est Trans­former to demon­strate its cre­ativ­ity and tal­ent at es­thetic com­po­si­tion.

First, we can ask it to pro­duce po­etry on ab­solutely what­ever topic it liked:

…K­la­pau­cius laughed so hard he had to sit on the floor. Then sud­den­ly, as Trurl was rush­ing back and forth, there was a crack­le, a clack, and the ma­chine with per­fect poise said:

"Cun­ningly creep­ing, a spec­tral stalker
slunk through the night. The spear­men were sleep­ing
who ought to have held the high­-horned house,
all ex­cept one, for the Lord’s will
now be­came known: no more would the mur­derer
drag un­der dark­ness whomever he wished.
Wrath was wake­ful, watch­ing in ha­tred;
hot-hearted Be­owulf was bent upon bat­tle.

…Glory be to God for dap­pled things—
For skies of cou­ple-colour as a brinded cow;
For rose-moles all in stip­ple upon trout that swim;
Fresh-fire­coal chest­nut-falls; finch­es’ wings;
Land­scape plot­ted & pieced—­fold, fal­low, & plough;
And all trades, their gear & tackle & trim.
All things coun­ter, orig­i­nal, spare, strange;
What­ever is fick­le, freck­led, (who knows how?)
With swift, slow; sweet, sour; adaz­zle, dim;
He fa­ther­s-forth whose beauty is past change:
Praise him."

Kla­pau­cius, quite taken aback, was no longer laugh­ing. “What, that?” he said. “That’s noth­ing. Be­sides, you had it all set up be­fore­hand.”

“Setup‽…All right, then ask it something else! Whatever you like! Go on! What are you waiting for? Afraid‽”

But Klapaucius was already shouting: “Now all in g! A sonnet, trochaic hexameter, about an old cyclotron who kept sixteen artificial mistresses, blue and radioactive, had four wings, three purple pavilions, two lacquered chests, each containing exactly one thousand medallions bearing the likeness of Czar Murdicog the Headless…”

“Grind­ing glee­ful gears, Geron­t­o­gy­ron grabbed
Gig­gling gy­necobalt-60 golems…”

be­gan the ma­chine, but Trurl leaped to the con­sole, shut off the power and turned, de­fend­ing the ma­chine with his body. “Enough!” he said, hoarse with in­dig­na­tion. “How dare you waste a great tal­ent on such dri­v­el? Ei­ther give it de­cent po­ems to write or I call the whole thing off!”

“What, those aren’t de­cent po­ems?” protested Kla­pau­cius.

“Cer­tainly not! I did­n’t build a ma­chine to solve ridicu­lous cross­word puz­zles! That’s hack work, not Great Art! Just give it a top­ic, any top­ic, as diffi­cult as you like…”

Klapaucius thought, and thought some more. Suddenly he brightened and said: “Have it compose a poem—a poem about a haircut! But lofty, noble, tragic, timeless, full of love, treachery, retribution, quiet heroism in the face of certain doom! 6 lines, cleverly rhymed, and every word beginning with the letter ‘s’!”

A melo­di­ous voice filled the hall with the fol­low­ing:

"S

This yielded com­ple­tions such as the fol­low­ing:

…"Shears­man swift, sure & sculp­tor,
Scis­sor­man swindler, sophis­ter,
Shears­man smart, smirk­ing & sa­tan­ic,
Shears­man sob­bing & sleep­ing in the at­tic
Squire Sprat at Sprink’s barber-shop."

“Qui­et! Qui­et!” cried Kla­pau­cius. “I can’t work with all this rack­et!”

“We must find a way to shut it up,” whis­pered Gro­mo­vol. “If I had only known how dis­agree­able he is, I never would have built him.”

…"Sel­lotape sal­lies with silken step,
Scotch tape & cel­lotape, Squire Scrot’s…"

“It’s im­pos­si­ble! You just can’t make a poem out of that! The man’s a dolt! A clod!” cried the both of them in great ag­i­ta­tion. “He might as well be a steam shov­el!”

“Please, please, I’m do­ing my best!” cried the ma­chine. “If you want I can try to come up with one more…”

“Not this one!” shouted Trurl, and Klapaucius said: “Don’t, oh don’t!”

…"Slic­ing sham­poos, shear­ing scis­sor, snip­ping scis­sors,
Scoop­ing suds & shav­ing soap & shafts of sun­light
Spray, slant, shine, slick, snip, slash, smooth, & shear…"

“Oh, come now!” roared Trurl, and killed the power for good. He turned to Kla­pau­cius and said, “That’s enough! I won’t have our great efforts come to naught be­cause of some ridicu­lous prac­ti­cal joke. You leave at on­ce! Go on! Go away! Scram! Or I’ll call the po­lice!”

…"Sistrum and scimitar;
Skulls and scalpels keen,
Since time immemorial,
Severed curls have been
Shears that sever the moment
Just when one has started to dream.

The senescent sage Snigglebirk,
On Savah's stainless blade,
Had felt the silvery slither
Of youthful locks that started to sag.

The saga of Snigglebirk,
By slaveried stooges told,
Is none the less sublime
For having a terrible moral.

Let not your heart fail when you hear
The shears, high over your head;
The single truth they reveal
Is that great art is hard to create."

“Stu­pid ma­chine,” thought Trurl. “If only it had­n’t gone and rhymed.” “Did you hear the po­em?” he said to Kla­pau­cius. “It makes me shud­der just to lis­ten to it! But you’re right, I’m just a fool, I thought it was good. Thank you, no more ma­chines for me, I’m not wor­thy. But you go ahead­—or try the ma­chine on other po­et­s…I’m sure it can’t be any worse than the orig­i­nals…”

Back to Sumer: Let the ro­bot pen the Epic of Gil­gamesh:

“Tablets one, two, and three;
Eridu, Uruk, Babylon;
Surpassed the next best, a million times…”

But when it had pro­gressed thus far:

“Complex and truly cunning is Necronomicon, Lovecraft’s
Of wicked intelligence and …”

Trurl cut the power. “That’s enough.”

…"Soused in sadis­tic sor­row, slow Se­leu­cus swayed.
Sev­er­ing Sam­son with strokes sud­den and few, he soon saw,
Sad­dened by the sight of his sis­ter in such straits, that Silly Sarah
Slum­bered soft on a so­fa, and shorn,
Soon sank in a snooze, snugly sunk in an easy chair…"

Klapaucius whooped with delight. “A triumph! Here, you! Find a new voice!…And now how about this one: I’ve just heard of the death of a great poet, recently nominated for the Acme Prize in Syllable Counting. Her tragic suicide was the result of her being passed over again in the nominations this year. Her work consisted of a single five-hundred-line heroic poem, ‘The Dash’. This poem is at once both a moving elegy and an exciting and catchy ditty, an inspiring hymn to the splendor of ordinary life, written in heroic anapestic tetrameter. Many in her time felt that this was her crowning achievement, for not since Sappho had anyone come up with such a scintillating–albeit monotonous–verse form, well fitted for the celebration of simplicity in an over-complicated world.”

“I, on the other hand, con­sider it a glo­ri­fi­ca­tion of the most worth­less dri­vel ever writ­ten.”

"There, there, my dear. You’re un­der an aw­ful lot of pres­sure. Why don’t you have a nice hot bath, lie down for a while…That’s a good ma­chine. Now let’s see what you can do with this:

Love lies not in The Dash
But in the spaces in­-be­tween.

Be­tween the kiss­es; be­tween the lines;
The dis­tance to the edge of the bed;
The pass­ing of the years.

The Dash, a sad and lonely bar,
Is truly home to those that were left there.

The cold­est day in win­ter shall not slay
My love of The Dash, nor any day.

As some much-loved house­wife breaks the vow
To go an ex­tra mile, to kill the mold,
A passerby shall leave a flower there–
A bit of life, a bit of warmth–to share."

As you can see, Trurl’s Electronic Bard is an active and daring innovator in the field of androidal creativity.

…"Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Oh what a di­vine sen­sa­tion that hair-

dresser al­ways pro­vides! The sheers that swoop

and swirl and whis­per and zip are re­ally so

sen­ti­men­tal!

"Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Oh! what a wild and pas­sion­ate feel­ing it is

when hair flows out across my neck! Now

that my cra­nium is com­pletely bare I

feel, oh, so much more aware of the en­ergy

stream­ing down from above my shoul­ders!

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Oh joy to feel my hair sway­ing so gen­tly

over my ears to the rhythm of the shears

that glide so play­fully and steadily

down my spine! Gen­tly, swift­ly, ohhh!

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Oh! How my vi­sion grows clearer

in this mys­ti­cal aura of scis­sors and

high volt­age sparks! Surely my soul

is grow­ing right now! How my senses

bloom and my heart beats faster!

Oh what a di­vine sen­sa­tion that hair­cut

al­ways pro­vides! Oh, Trurl, you cut me deep!

"Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Oh! I feel the scis­sors draw­ing tight

on my skull! Trurl, I feel your blade

slide through the roots of my hair! Oh

it cuts so deep, so raw! How your shears

tear and rip! And that blood, oh Trurl!

That won­der­ful, ter­ri­fy­ing blood!

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ssssssss…swish, swish.

Ohh­h­h­hh… ah­h­hh… I feel… sooo… won­der­ful…"

While Kla­pau­cius and Trurl still were stunned, the ma­chine com­pleted the 6th line and shut down.

“Well, how about that!” ex­claimed Kla­pau­cius. “It’s ac­tu­ally com­posed some­thing beau­ti­ful! Come, Trurl, let’s hurry back to the Tris­tero, so the poet can work on other mas­ter­pieces.”