Pervading our culture is an idea, programmed into us on social media and in the general way people speak about and position art, history, and non-STEM academic studies (the humanities): anything that does not generate large profits is a hobby or an inconsequential pastime. STEM fields are careers; humanities fields (journalism, writing, design, art, anthropology, history, sociology, and so on) are "just for fun." This is despite the fact that all of them require degrees, years of study, honing a craft alongside people more experienced in the field, and making money from that craft. The difference is that artists, educators, and writers make less money and are more often censored, silenced, and even outright attacked.
One of the pervading evils this ideology has unleashed on society is the emergence of Generative AI.
Several authors have filed a lawsuit against OpenAI for illegally acquiring and using copyrighted works to train the GPT generative AI model. OpenAI allegedly used piracy sites to acquire books, then trained its GPT model on them without compensating the authors or contacting authors or publishers for permission to use their books.

OpenAI allegedly used piracy to train generative AI (a model that responds to a prompt by producing text, such as ChatGPT). The use of pirated material to train the model indicates the AI is not meant to be used for good. It is as if a scientific experiment were conducted under uncontrolled or unethical conditions: the results of something like that cannot be trusted. And ironically, the results of this theft are themselves a form of piracy.
For example: as a result of this theft, I could ask ChatGPT to "write a story similar to the works of, and in the writing style of, George R.R. Martin" or, more importantly, of a less successful author, then put the book online (say, on Amazon as a self-published novel) and make money off an AI book that is, as far as copyright law goes, a rip-off of that author's work, generating profits the author is never compensated for. This is a hypothetical, but AI books are already popping up online in great numbers, including AI cookbooks.
Writing style is part of intellectual property. OpenAI copies the syntax, diction, even characters (by which I mean the types of characters and themes an author creates and is known for), and overall writing style of authors, which undermines their creative and imaginative property, the very thing on which they have built a career and a living. This is purposeful on the part of OpenAI. In this way, AI companies profit off the labor of authors, labor that goes under-appreciated and underpaid even within the publishing industry. It also undermines the reputation of successful authors when fake sequels are produced and people are fooled into believing they are real.
Piracy has always existed. AI "writing" is just a glossier and more socially accepted version of piracy. But why is it socially acceptable?
AI whistleblower Suchir Balaji, a former employee of OpenAI, tragically died recently. His last Twitter post argued that, based on his understanding of fair use and his experience working at OpenAI, the company was knowingly committing copyright infringement.
Fair use law changes based on new evidence and cases, and it should change to account for AI scraping. It is not fair use to take an entire book without permission and train a generative AI model to recreate a similar but "not the exact same" book with impunity.
AI books will share the same or highly similar properties as the works the models are trained on, and the use is commercial not only for OpenAI but for the individuals selling AI books on Amazon for private profit.

OpenAI was illegally acquiring entire books, not merely quotes or excerpts, to train the model. Even if it had used only quotes, the use of those materials would not be fair use, because the output is not significantly rearranged, as Balaji outlines in his post. Copyright protects "the creative choices made by an author."
To add to this, the creative choices a writer makes with story and language are protected by copyright and are the very building blocks AI uses to mimic real writing. If ChatGPT writes a "children's story about a girl who goes to magic school," for example, that story will inevitably pull from books already written about a girl at a magic school: incidents, characters, themes, plot and structure elements, even the writing and narration styles of specific authors (how one author employs first-person narration will differ from another; some authors use more colorful or "purple" prose while others use bare-bones, straightforward language, and so forth). The model, having been trained on a writer's story without the writer's permission, will spit out a story arranged exactly the way the author arranged theirs, and that is not fair use. It is copyright infringement. If I copied and pasted Harry Potter into a Word doc, changed the names, swapped the house colors and symbols, and made Voldemort a woman, I would still be committing theft.
Stolen novels used to train AI models also affect ChatGPT's output where prose style is concerned. If I ask the model to rewrite the magic-school story with more colorful prose, for example, and to switch into third person, the model will pull from the stolen novels it was trained on that are written in that style and mimic those authors. This is why no "AI book" is a robot just "writing a book" and "coming up with its own idea." There is no such thing as a robot who wants to be a writer; that is personifying something that is not human. Books that AI "writes" are just imitations of authors' copyrighted and creatively unique works. AI is not "creative"; it is merely a synthesizer at best and a copycat at worst.
OpenAI claims the novels are "publicly available data." Well, yes: pirated books are technically "publicly available" if you have internet access and bad intentions.
Microsoft is also guilty of stealing writing without authors' permission, and not just the final product but writing still in progress. Microsoft Word scrapes documents from private users to train AI models. Word users are literally paying Microsoft to have their unpublished works stolen from them. There is a complicated, off-the-beaten-path way of turning off the AI training. Most authors use Word to write novels; this is well known, and Microsoft knows it. They are getting in on the ground floor of the theft.
To turn off AI scraping in Microsoft Word, open a document, go to "Word" at the top left, click "Preferences," then "Privacy," then "Connected Experiences," and uncheck the box that allows connected experiences. This turns off "connected experiences" (i.e., AI scraping). Word does not even state in plain language that "connected experiences" includes AI model training, which is misleading.
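Relatedly, authors who post excerpts or serialized chapters on their own websites can tell known AI crawlers to stay out via a robots.txt file at the site root. A minimal sketch is below, using the publicly documented user-agent tokens for OpenAI's crawler (GPTBot) and Common Crawl's bot (CCBot, whose archives are widely used for AI training); note that robots.txt is only a request, honored solely by crawlers that choose to respect it:

```
# robots.txt — place at the root of your site (e.g., example.com/robots.txt)

# Ask OpenAI's web crawler to skip the entire site
User-agent: GPTBot
Disallow: /

# Ask Common Crawl's bot to skip the entire site
User-agent: CCBot
Disallow: /
```

The Authors Guild's practical-tips article (cited below) discusses this kind of crawler blocking in more detail.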
The lawsuit against OpenAI is not a petty gripe over profits. In other words, this is not the already wealthy George R.R. Martin being greedy; it is a larger issue that mostly impacts authors far less successful and wealthy than Martin.
The unfortunate fact undergirding this phenomenon is that tech companies (which are made up of tech people) are greedy and lack creativity and imagination. They want to parasitically leech off truly creative, imaginative, often empathetic people (such as writers) while dismissing that imagination and creativity as unnecessary because it can be "replaced" by AI. The express aim of this replacement is that people will not be compensated for their art and will focus only on labor that maximizes profits for CEOs and tech moguls, who will then lobby for lower taxes and fewer regulations. That undermines democracy, makes the poor poorer and the rich richer, and silences the voice of the average person, a voice expressed through art and literature (as well as journalism, though mainstream media's problematic relationship to power structures is a more complex topic).
In this twisted hypothetical world, art would only be a hobby, as the popular (misguided and wrong) opinion already holds. Many novels written throughout history, including but not limited to the most famous ones or "classics," give voice to the marginalized or illuminate how oppressive power structures operate (such as Toni Morrison's Beloved). This is especially true of fiction but also applies to nonfiction (e.g., The Autobiography of Malcolm X), and it is exemplified by the myriad banned books that conservative pundits, religious "authorities," and right-wing political freaks have pushed to remove from schools.
Authors Guild CEO Mary Rasenberger points to authors "learning and perfecting [our] crafts." Just like programmers who go to university, just like engineers, scientists, and anthropologists and historians (also overlooked for being in the humanities), writers went to school, earned degrees, master's degrees, or even PhDs in literature or creative writing, dedicated themselves to their craft, took years to develop their skills, and now apply those skills (along with creativity) to make a living. Just like everyone else. Yet it is telling that writers are the first ones people steal from or take credit from (ghostwriting and celebrity "autobiographies" being prime examples of this problem long before AI).
It is a widespread cultural belief that writing is "easy," that "anyone can do it," and simultaneously that "no one should have to do it" and we should "just make a computer do it." If it is so easy, why bother making a computer do it? Why train a model on skilled authors, essayists, and journalists? Perhaps because writing, like any other skill, is not easy. People would prefer ChatGPT to write their emails and spit out soulless "books" because they are lazy and incompetent, and yet they turn around and dismiss writers as inessential to society. As if writing had an expiration date, some milestone at which humanity graduates beyond the need for intellect, critical thinking, and imagination, like growing beyond gas-powered cars or adopting solar panels. The marketing that we need AI is always framed as innovation and technological progress: human writing and human art (the only real writing and real art there is) belong to the old, barbaric world, like landlines and phone books or doing math by hand, while AI is of the future, or worse yet, of the present.
You cannot teach creativity or imagination. You can teach the craft of writing, but not the essential element that makes writing interesting, worth reading, and culture-making: creativity itself, the unique ability to create new arrangements of language, character, and story elements, and to formulate themes and a message for the audience from them. You can teach someone to use and manipulate technology, and that person, regardless of their creativity, integrity, empathy, or imagination, can follow the rules and formulas. But art and writing have no formula; they do not fit in a neat box. So naturally, those who do not understand art, or those who do understand its power to create freedom and knowledge and to awaken people to themselves and to unequal power structures, want not only to control art, box it up, and silence it; they want to destroy it. And then profit from said destruction.
They want to remove the artist from art, but if you do that, you are left with something lifeless, soulless, objectively mediocre, and objectively meaningless. Art's key purpose is its meaningfulness. Take that away and you get something programmers and tech moguls call "art" but that is not art at all; it is merely a line of code that visually and intellectually assaults the eyes and mind with its very lack of quality and purpose. You cannot "train" a model to be creative, either, when all it "creates" is a poor rearrangement, like tearing the pages out of a book and re-sorting them: not a new book, just a confused, thinly veiled theft of the real thing.
Technology without the humanities is, well, inhuman. Generative AI is not "Frankenstein's monster." The monster in the novel (which most tech people who reference the book have probably never read, or worse, read and entirely missed the point of) is intellectually curious, emotional, and craving love. He is creative, he reads, he desires to be seen as human. His very human emotions and creativity make him not a monster but a meaningful being. Although created by science, he is free of the burdens of scientific formula and hypothesis once he discovers and interacts with the world. Artists and writers, therefore, have more in common with the monster, and AI companies have more in common with Dr. Frankenstein: wanting so badly to create something in their own image using technology while rejecting all evidence of humanity, which they find threatening and horrifying. Art threatens their power, much as the monster's humanness threatens the Doctor's self-aggrandizing ambitions.
How can authors protect themselves? A new AI copyright clause has been introduced and is being implemented in novels.
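The exact wording varies by publisher; the following is illustrative language modeled on the kind of notice the Authors Guild has recommended for copyright pages (my paraphrase for example's sake, not official text):

```
No part of this book may be used or reproduced in any manner
for the purpose of training artificial intelligence technologies
to generate text, without the express written permission of the
author and publisher.
```
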
Books that contain content written with AI are required to disclose that usage. But using AI undermines the integrity of writing as a profession and as a skill. Even if a book was only partially written with AI, its author immediately becomes suspect: lacking integrity, skill, and intelligence. That author will have stolen from other writers (the ones the AI was trained on), and if their ideas for plot structure, story, or character were generated in an AI "brainstorm," those ideas are likewise the stolen property of authors who managed to complete novels all by themselves. No one should ever be considered an "author" while using AI, because it throws the entire definition of being a writer into question.
It is not elitist to say that if you used AI to write a book, you have not written a book; you have stolen from a real writer. (It would not be elitist to call a scientist who lied about experimental results a fraud.) Nor is it elitist to say that anyone who "needs" AI to write anything, let alone a book, is at the very least not a writer, and at worst shows a serious lack of intelligence. It is not elitism to identify and define what makes a writer a writer and an author an author. The idea that "anyone can do it" will blur the line between real books and fraudulent ones, especially once "writers" start using AI to write some but not all of a book; they will of course claim they did most of the work and should still get the credit and be celebrated. This undermines the credit that real authors and writers deserve for writing a whole novel from scratch. Does making a shitty boxed cake from the drugstore make me a baker? Claiming such nonsense would be an insult to bakers everywhere.
Calling any piece of AI-assisted writing "well-written" is an insult to all writers. Any AI-written (stolen) work is an insult to the very writers who are the reason such a model can produce anything that reads as satisfactory to the average person at first glance. Scraping the private writings of Word users is an insult; stealing books without paying authors for their work and then using those stolen materials to generate profits the authors have a right to is an insult, a threat, and a symptom of a growing literacy crisis that is anti-intellectual, anti-free-thought, anti-democratic, and anti-human.
Remember, art is powerful. It is revolutionary. Knowing how to read and think critically threatens oppressive systems; writing does even more so. OpenAI and the mega-wealthy tech oligarchs know this and have a vested interest in theft in the short term (to make their models mimic humans better) and, in the long term, in replacing and censoring artists and writers.
Sources
Balaji, S. (2024, October 23). When Does Generative AI Qualify for Fair Use? suchir.net. https://suchir.net/fair_use.html
Davies, A. (2024, December 14). OpenAI whistleblower found dead in San Francisco apartment. BBC. https://www.bbc.com/news/articles/cd0el3r2nlko
Can I Use Someone Else's Work? Can Someone Else Use Mine? U.S. Copyright Office. https://www.copyright.gov/help/faq/faq-fairuse.html
The Authors Guild. (2023, September 20). The Authors Guild, John Grisham, Jodi Picoult, David Baldacci, George R.R. Martin, and 13 Other Authors File Class-Action Suit Against OpenAI. https://authorsguild.org/news/ag-and-authors-file-class-action-suit-against-openai/
The Authors Guild. (2023, September 8). Practical Tips for Authors to Protect Their Works from AI Use. https://authorsguild.org/news/practical-tips-for-authors-to-protect-against-ai-use-ai-copyright-notice-and-web-crawlers/
Top 100 Most Frequently Challenged Books 2010-2019. American Library Association. https://www.ala.org/bbooks/frequentlychallengedbooks/decade2019