Creatives Doing Manual Labour and Robots Creating

Scrolling through Facebook for five minutes this morning, I came across no fewer than four ads offering support for generating AI novels and getting them published via Amazon. One even claimed to provide tools for writing and publishing at least one book daily. Many of my colleagues fear AI will replace us, as borne out by the article in the Guardian, November 19th, about HarperCollins selling their back catalogue to a multinational big data concern to train AI models to write. I can think of only one reason publishers would become involved in training AI: to reduce costs by removing writers from the production cycle. However, that is not an overnight transition.

What was an overnight transition was the flood of dross from wannabe fast-buck makers: readers could no longer trust sellers like Amazon and stopped buying from all but “trusted” suppliers, the same suppliers now providing back catalogues to AI modellers. When publishing a book via Amazon, there is a checkbox to indicate whether anything in the book was AI-generated, including text, graphics, translation, and editing. Amazon does not, however, make that information public; or if it does, no one is using AI, or at least no one is admitting to it.

So, what does it all mean?

It will still take some time before AI can seamlessly mimic human writers, if it ever does. In the meantime, a flood of low-quality content will overshadow the work of human professionals and drag down the overall quality of literature available to readers. Publishers will continue to pay writers until they can replace us with robots, but it will become ever harder for new writers to get a publishing deal. Indie publishers like PerchedCrowPress face an ever-steeper climb because of a lack of trust from buyers. There is already an overriding belief that indie equals poor quality, which could not be wider of the mark. Imagine how a wave of low-quality content hiding behind an indie label will erode what little trust remains.

So, what can we do about it?

The onus is on companies like Amazon to enforce rules about disclosing AI-generated content and to publicise this information on their platforms. Some may argue that enforcement would be difficult, but it need not be: a program similar to those used to detect plagiarism may suffice. Issuing penalties to publishers who fail to disclose their use of AI would also discourage the practice. However, companies like Amazon are unlikely to regulate anything they see as a future revenue source. As writers, our hope lies in legislative bodies compelling them to enforce transparency in the content they sell.

Summary

Okay, so those offering tools that make it possible to publish a novel a day are conning the susceptible. Generating content quickly is possible, but packaging and publishing a book takes much longer.

Packaging a book, including formatting, creating covers (even with AI-generated images), and writing a blurb, takes time. Some will be faster than others, but it takes longer than a day. When producing a book, I always allow two weeks in the plan for packaging.

Publishing also takes more than a day. Amazon cites 72 hours from the moment you hit the Publish button to when the book becomes available, and sometimes it takes longer. Admittedly, it can be quicker, but across the thirty books I have published, the average is about 36 hours.

However, the susceptible will not realise that until they try it. By then, they are likely to publish their poorly packaged content anyway, exacerbating the problem.

So, if legislators want to avoid a scenario where people read only AI-generated content, they must support initiatives that promote human creativity in literature; and if they cannot stem the flow of AI-generated literature, they can at least make its presence transparent. Readers, for their part, must be discerning in their choices.