e-ISBN: 978-93-6252-014-2
IIP Series, Volume 3, Book 5, Part 3, Chapter 3
A JOURNEY THROUGH GENERATIVE POETRY: A HISTORICAL OVERVIEW OF GENERATIVE
POETRY'S ODYSSEY
Artificial intelligence (AI) has had a significant impact on many areas of human
activity, including the arts. One area that has seen a recent surge in interest is AI poetry,
which involves the use of AI algorithms to generate poetry. The goal of AI poetry is to
produce poems that are coherent, imaginative, and evocative, and that capture the essence of
the human experience. The following paper explores the history of generative poetry or
computer-generated poetry and provides insight into the present scenario. The research
employs qualitative analysis and close reading techniques to understand the area better.
There have been quite a few research works on AI poetry. "Generation of Poems with a Recurrent Neural Network" by Denis Krivitski proposes the idea of the possibility of AI poetry. "Generating Poetry using Neural Networks" by Tanel Kiis and Markus Kängsepp and "Augmenting Poetry Composition with Verse by Verse" by David Uthus, Maria Voitovich and R.J. Mical deal with the technology of creating an engine capable of poetry. "Template-Free Construction of Poems with Thematic Cohesion and Enjambment" by Pablo Gervás and "Poet's Little Helper: A Methodology for Computer-Based Poetry Generation. A Case Study for the Basque Language" by Aitzol Astigarraga and others contain analyses of the poetic capabilities of Artificial Intelligence. These studies are comparatively older; because of the new surge in the field, it is important to revisit and study the progression of the technology and the nature of the poetry generated.
The history of AI poetry can be traced back to the earliest days of computer science
when researchers first started exploring the potential of computers to create and manipulate
language. One of the earliest examples along this path is the program ELIZA, which emerged from the creative efforts of Joseph Weizenbaum in 1966. Serving as an early instance of the chatbot, a computer program aimed at simulating human conversation, ELIZA was meticulously crafted to emulate the mannerisms of a psychotherapist engaged in dialogue with patients. Its operational framework relied on keywords extracted from user inputs, which it used to generate contextually relevant and stimulating responses. It is important to underscore that ELIZA's genesis was not a deliberate endeavour to engineer artificial intelligence. Rather, Weizenbaum conceived it to probe the boundaries of language processing and the ease with which a computational system could create the illusion of understanding in its human interlocutors. Nevertheless, the popularity of ELIZA surged rapidly, and it spurred a widespread belief in its adeptness at comprehending and addressing human emotions.
Both poetry and music involve the manipulation of language and structure to evoke emotional and aesthetic responses from audiences. LILIPUTIANS, a system for generating music, takes an approach that parallels the methodologies used in some AI-generated poetry systems. Genetic programming, which LILIPUTIANS utilizes, involves the evolution of content against predefined rules and evaluations, a concept that can also be adapted to generating poetic expressions. More broadly, the exploration of algorithms for creating artistic content, whether in music or poetry, highlights the potential of AI to engage in creative endeavours. It underscores the notion that computational systems can contribute to the artistic landscape by producing content that resonates with human audiences, albeit through different forms of artistic expression.
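The chapter does not describe LILIPUTIANS' internal workings, so the following is only a minimal, hypothetical sketch of how an evolutionary loop of the kind just described, one that evolves candidates against a predefined evaluation rule, might be adapted to verse. The word bank, the syllable-count fitness rule, and all function names here are illustrative assumptions, not any actual system.

```python
import random

# Illustrative word bank and target line length; purely hypothetical choices.
WORDS = ["moon", "river", "silent", "glass", "burning", "echo", "winter", "light",
         "falls", "sings", "over", "through", "the", "a", "pale", "slow"]
TARGET_SYLLABLES = 7  # aim for a rough seven-syllable line


def count_syllables(word: str) -> int:
    """Very rough syllable estimate: count groups of consecutive vowels."""
    vowels = "aeiouy"
    groups, prev_vowel = 0, False
    for ch in word.lower():
        is_vowel = ch in vowels
        if is_vowel and not prev_vowel:
            groups += 1
        prev_vowel = is_vowel
    return max(groups, 1)


def fitness(line: list[str]) -> float:
    """Higher is better: penalise distance from the target syllable count."""
    syllables = sum(count_syllables(w) for w in line)
    return -abs(syllables - TARGET_SYLLABLES)


def random_line(length: int = 5) -> list[str]:
    return [random.choice(WORDS) for _ in range(length)]


def crossover(a: list[str], b: list[str]) -> list[str]:
    cut = random.randint(1, min(len(a), len(b)) - 1)
    return a[:cut] + b[cut:]


def mutate(line: list[str], rate: float = 0.2) -> list[str]:
    return [random.choice(WORDS) if random.random() < rate else w for w in line]


def evolve(generations: int = 50, pop_size: int = 40) -> list[str]:
    population = [random_line() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]  # selection against the rule
        children = [mutate(crossover(random.choice(survivors),
                                     random.choice(survivors)))
                    for _ in range(pop_size - len(survivors))]
        population = survivors + children
    return max(population, key=fitness)


if __name__ == "__main__":
    print(" ".join(evolve()))
```

A real system would replace the toy syllable rule with richer evaluations of rhyme, grammar, or semantics, but the select-crossover-mutate loop stays the same.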
Natural Language Processing (NLP), the technology still used for generative writing today, was first employed by Roger Carl Schank in his program STANZA. Since the program was never available for the public to access, the details provided here are based on his claims. In the late 1960s, Schank developed conceptual dependency theory and case-based reasoning, both of which helped to expand the scope of AI's use of language. Positioned within the framework of knowledge representation systems, STANZA innovatively employs frames to encapsulate and convey intricate information about the world. Notably, it stands as one of the pioneering instances of knowledge representation systems being integrated into natural language processing.
The structural architecture of STANZA's frames embodies the interplay of slots and
fillers. These elements collectively function to delineate the constituents of a frame—slots
serving as receptacles for distinct units of information, while fillers denote the factual values
attributed to these informational facets. This amounts, in effect, to a distinction between content and form: the machine decides what content is best delivered in what form. In practical application,
STANZA found utilization across various fronts of natural language processing, particularly
in domains such as machine translation and question answering. Beyond its direct
applications, STANZA played an instrumental role in the inception and maturation of
subsequent knowledge representation systems. Though the actual capability of STANZA is unknown, Roger Schank claimed that it could narrate stories. Later, in 1984, he opined that
Artificial Intelligence could be used to generate poetry.
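Because STANZA was never released, only the general slot-and-filler idea can be illustrated. The sketch below is a hypothetical Python rendering of that idea: the frame, the slot names, and the rule that maps a filler value to a surface form are assumptions made purely for illustration, not a reconstruction of Schank's program.

```python
from dataclasses import dataclass, field

# A hypothetical, minimal slot-and-filler frame: slots name the pieces of
# information a frame can hold, fillers supply the actual values.
@dataclass
class Frame:
    name: str
    slots: dict[str, object] = field(default_factory=dict)  # slot -> filler

    def fill(self, slot: str, filler: object) -> None:
        self.slots[slot] = filler

    def filler(self, slot: str):
        return self.slots.get(slot)


# Content: what is being said about the world.
event = Frame("walking")
event.fill("agent", "the poet")
event.fill("location", "a winter shore")
event.fill("mood", "melancholy")

# Form: a trivial rule chooses how the same content is rendered as a line,
# illustrating the content-versus-form distinction described above.
if event.filler("mood") == "melancholy":
    line = f"{event.filler('agent')} walks alone along {event.filler('location')}"
else:
    line = f"{event.filler('agent')} strides along {event.filler('location')}"

print(line)
```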
Another major breakthrough took place in the 1980s. RACTER, a name derived from "raconteur," is an artificial intelligence (AI) program capable of spontaneously generating prose in the English language. The program was written by William Chamberlain and Thomas Etter in 1983. The formal unveiling of RACTER came with the publication of a book titled The Policeman's Beard Is Half Constructed (ISBN 0-446-38051-2), the entire content of which was attributed to the program's creative output. The book is considered to be the first book written entirely by a computer.
RACTER was available for public use and was compatible with Apple computers. A large number of people have used the program to generate writings, including poetry. Here is a sample of writing from The Policeman's Beard Is Half Constructed:
Despite these critiques, the Cybernetic Poet occupies a significant niche within the
progression of artificial intelligence. It occupies a seminal position as one of the initial
endeavours employing genetic algorithms for the generation of innovative textual content. Its influence reverberates in its illustration of AI's latent capacity for creative enterprise, even while engaging in an algorithmically guided creative process.
Natural Language Processing (NLP) has enabled machines to adhere to the rhythmic and semantic intricacies of human language. By harnessing algorithms
that comprehend linguistic patterns, sentiment, and stylistic features, NLP-driven poetry
generation imbues machines with the ability to create verses that evoke emotions and
resonate with readers. Through probabilistic modelling, deep learning, and text generation
methods, NLP-driven poetic outputs exhibit a fusion of creativity and structure. This
intersection of technology and aesthetics not only showcases the potential of AI to emulate
human expression but also opens new avenues for exploring the interplay between
technology and artistic creativity.
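As one concrete, hedged illustration of the probabilistic side of such systems, the sketch below builds a tiny first-order Markov chain over words and samples a short line from it. The miniature corpus and the chain order are illustrative assumptions; real systems learn far richer models from far larger corpora.

```python
import random
from collections import defaultdict

# Tiny illustrative corpus; a real system would learn from a much larger one.
corpus = (
    "the moon sings over the silent river "
    "the river carries the pale light "
    "the light falls through the winter air"
).split()

# Build a first-order Markov model: word -> observed next words.
transitions: dict[str, list[str]] = defaultdict(list)
for current, nxt in zip(corpus, corpus[1:]):
    transitions[current].append(nxt)


def generate(seed: str, length: int = 8) -> str:
    """Sample a short line by following learned word-to-word transitions."""
    word, line = seed, [seed]
    for _ in range(length - 1):
        choices = transitions.get(word)
        if not choices:  # dead end: no observed continuation
            break
        word = random.choice(choices)
        line.append(word)
    return " ".join(line)


print(generate("the"))
```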
The current state of AI poetry is one of rapid innovation, with new systems and
algorithms being developed all the time. One of the most exciting recent developments is the
rise of deep learning models, such as GPT-3, which have the ability to generate high-quality
text. These models have been used to create AI poetry that is often difficult to distinguish
from human-generated poetry. ChatGPT is a large language model developed by OpenAI,
designed to generate human-like text. The model is trained on a massive amount of diverse
text data, which is a critical factor in determining its ability to generate coherent and
informative text. ChatGPT was made available to the public on 30 November 2022.
The first step in training ChatGPT is to collect a vast amount of text data, which is
then pre-processed to make it suitable for training the model. The data used to train ChatGPT
includes a wide variety of sources, such as books, articles, web pages, and social media posts.
The data is pre-processed to remove irrelevant information, such as images and links, and to
clean it up by removing special characters, numbers, and HTML tags. This process helps to
ensure that the data is of high quality and that it is consistent, which is crucial for the model's
performance.
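A minimal sketch of this kind of cleaning step is shown below. The specific regular expressions, their order, and the sample input are illustrative assumptions; they are not OpenAI's actual pre-processing pipeline.

```python
import re


def clean_text(raw: str) -> str:
    """Illustrative cleaning pass: strip HTML tags, links, numbers and
    special characters, then normalise whitespace."""
    text = re.sub(r"<[^>]+>", " ", raw)                # drop HTML tags
    text = re.sub(r"https?://\S+", " ", text)          # drop links
    text = re.sub(r"\d+", " ", text)                   # drop numbers
    text = re.sub(r"[^A-Za-z\s.,;:'!?-]", " ", text)   # drop special characters
    return re.sub(r"\s+", " ", text).strip()           # collapse whitespace


raw = "<p>Visit https://example.com for 42 poems &amp; more!</p>"
print(clean_text(raw))
# -> "Visit for poems amp; more!"
```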
The training of ChatGPT is a complex process that involves collecting a vast amount
of text data, pre-processing it, and training a deep neural network based on the Transformer
architecture. The objective of the training process is to optimize the model's parameters so
that it generates text that is as close as possible to the text in the training data. The result of the training process is a large language model that is capable of generating human-like text,
which has numerous applications in areas such as chatbots, language translation, and content
generation.
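ChatGPT's own training pipeline is proprietary, but the underlying next-token objective of Transformer language models can be sketched with an openly available stand-in. The example below fine-tunes GPT-2 through the Hugging Face transformers library for a few toy steps; the model choice, the one-line corpus, and the optimizer settings are assumptions for illustration only.

```python
# A minimal sketch of the next-token (language modelling) objective used to
# train Transformer models of this family. GPT-2 is an open stand-in here;
# this is not ChatGPT's actual training code.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

# A toy "training corpus" of one line; real training uses billions of tokens.
batch = tokenizer("The moon sings over the silent river", return_tensors="pt")

model.train()
for step in range(3):  # a few illustrative optimisation steps
    outputs = model(**batch, labels=batch["input_ids"])
    loss = outputs.loss  # cross-entropy over next-token predictions
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    print(f"step {step}: loss = {loss.item():.3f}")
```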
The same principle is applied to generate poems. When ChatGPT generates a poem, it
starts by receiving a prompt or seed text as input. Based on this input, the model uses its
trained knowledge to generate a continuation of the text in a way that is coherent and makes
sense. For poetry generation, the model draws on patterns learned during training to produce text with a particular rhyme, meter, and structure, while also attempting to capture the essence of the human
experience. However, it is important to note that while ChatGPT can generate poems that are
coherent and imaginative, they may not always be truly original or express human-like
emotions.
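Since ChatGPT itself is accessed through OpenAI's hosted service, the hedged sketch below uses an open model (GPT-2 via the transformers pipeline) as a stand-in to show the same prompt-and-continue pattern. The prompt wording and sampling settings are illustrative assumptions, and a small open model will not match ChatGPT's quality.

```python
from transformers import pipeline, set_seed

# Open-model stand-in for prompt-driven poem generation; not ChatGPT itself.
generator = pipeline("text-generation", model="gpt2")
set_seed(42)  # make the sampled continuation repeatable

prompt = "Write a short poem about the sea at dusk:\n"
result = generator(
    prompt,
    max_new_tokens=60,        # length of the generated continuation
    do_sample=True,           # sample rather than always take the likeliest token
    temperature=0.9,          # higher values give more varied, "imaginative" text
    num_return_sequences=1,
)

print(result[0]["generated_text"])
```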
Computers cannot really think by themselves, and emotions are alien to them. Computers simply receive prompts, that is, they take in information, process the data against existing information, and produce results. Research conducted at the University of Toronto in 2018 concluded that computers can solve two of the four problems of verse composition easily while struggling with the other two: they can follow the rules of meter and rhyme with ease, but producing something readable and something that can evoke emotion in a reader was not easy for them. However, with the technology developed in the last five years, that may have changed.
Today, abundant tools are available to the public for generating poetry; some of them are Google Bard, Verse by Verse by Google, poem-generator.org.uk, and the Vogon Poetry Generator. Amidst the ever-evolving technological landscape, discernible strides have been made in
the progression of generative poetry. Despite its historical roots spanning over five decades,
generative poetry has now achieved a prominence unparalleled in its trajectory. Moreover, it
is noteworthy that the calibre of the output has exhibited a marked enhancement over the past
couple of years. As we peer forward, an air of anticipation surrounds the unfolding prospects
that the future might unveil in this realm.
BIBLIOGRAPHY
[1] Krivitski, Denis. "Generation of Poems with a Recurrent Neural Network." Medium, 23 July 2018, medium.com/@DenisKrivitski/generation-of-poems-with-a-recurrent-neural-network-128a5f62b483.
[2] Editorial Board. "Opinion | Machines Can't Quite Crack Shakespeare. That's a Relief." The Washington Post, WP Company, 28 Apr. 2019, www.washingtonpost.com/opinions/machines-cant-quite-crack-shakespeare-thats-a-relief/2019/04/28/bad569ea-6854-11e9-82ba-fcfeff232e8f_story.html.
[3] Lau, Jey Han, et al. "'Deep-speare' Crafted Shakespearean Verse That Few Readers Could Distinguish from the Real Thing." IEEE Spectrum, vol. 57, no. 5, 2020, pp. 40–53, https://doi.org/10.1109/mspec.2020.9078455.
[4] Lau, Jey Han, et al. "Deep-Speare: A Joint Neural Model of Poetic Language, Meter and Rhyme." Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2018, https://doi.org/10.18653/v1/p18-1181.
[5] Uthus, David, et al. "Augmenting Poetry Composition with Verse by Verse." Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Industry Track, 2022, https://doi.org/10.18653/v1/2022.naacl-industry.3.
[6] Gervás, Pablo. "Template-Free Construction of Rhyming Poems with Thematic Cohesion." Proceedings of the Workshop on Computational Creativity in Natural Language Generation (CC-NLG 2017), 2017, https://doi.org/10.18653/v1/w17-3903.
[7] Astigarraga, Aitzol, et al. "Poet's Little Helper: A Methodology for Computer-Based Poetry Generation. A Case Study for the Basque Language." Proceedings of the Workshop on Computational Creativity in Natural Language Generation (CC-NLG 2017), 2017, https://doi.org/10.18653/v1/w17-3901.
[8] Weizenbaum, Joseph. Computer Power and Human Reason: From Judgment to Calculation. United Kingdom, W. H. Freeman, 1976.
[9] Jarow, Oshan. "How the First Chatbot Predicted the Dangers of AI More than 50 Years Ago." Vox, 5 Mar. 2023, www.vox.com/future-perfect/23617185/ai-chatbots-eliza-chatgpt-bing-sydney-artificial-intelligence-history.
[10] The New Grove Dictionary of Music and Musicians: Taiwan to Twelve Apostles. United Kingdom, Grove, 2001.
[11] Berry, David. "Weizenbaum, Eliza and the End of Human Reason." Figshare, University of Sussex, 9 June 2023, sussex.figshare.com/articles/chapter/Weizenbaum_ELIZA_and_the_end_of_human_reason/23466656.
[12] Schank, Roger C., and Peter G. Childers. The Cognitive Computer: On Language, Learning, and Artificial Intelligence. United Kingdom, Addison-Wesley Publishing Company, 1984.
[13] Beysolow II, Taweh. "What Is Natural Language Processing?" Applied Natural Language Processing with Python, 2018, pp. 1–12, https://doi.org/10.1007/978-1-4842-3733-5_1.
[14] "Roger Schank." Wikipedia, Wikimedia Foundation, 1 July 2023, en.wikipedia.org/wiki/Roger_Schank.
[15] Nasta, Terry. "Thief of Arts." PC Magazine, 25 Dec. 1984, p. 63.
[16] The Policeman's Beard Is Half-Constructed: Computer Prose and Poetry. United States, Warner Software/Warner Books, 1984.
[17] Angeline, P.J., et al. "An Evolutionary Algorithm That Constructs Recurrent Neural Networks." IEEE Transactions on Neural Networks, vol. 5, no. 1, 1994, pp. 54–65, https://doi.org/10.1109/72.265960.
TOOLS REFERENCES