
On using LLMs in writing

skip.knox

toujours gai, archie
Moderator
I recently read a pretty good article in Ars Technica that relates one person's use of AI to write a bit of code. Those interested can read the whole article (link below), but I found his observations relevant to what we do around here. I'll quote the key paragraph, then add a few comments.

"These tools as they exist today can help you if you already have competence. They cannot give you that competence. At best, they can give you a dangerous illusion of mastery; at worst, well, who even knows? Lost data, leaked PII, wasted time, possible legal exposure if the project is big enough--the 'worst' list goes on and on"

These comments feel both relevant and important. Transposing a bit, the passage says if you already write well, at the level of competence, then the tools can be useful. Note that he does not say a word about the tools being morally admirable. Just useful. The tools cannot, however, bestow competence upon the incompetent. Or, I shall add, on the lazy, the over-eager, the woefully naive, or the unscrupulous.

I'll venture further afield. Don't expect AI to make you better than you are. Don't expect it to teach you anything if you are not already adept at learning. [ex-professor aside: learning is itself a craft that must be learned; it's not merely an event that happens]

And, at least for the time being, if you aren't up to the job going into the job, there's a fair chance the job is going to be botched. Perhaps worse, the AI will be good enough to fool you into thinking you're done, when in fact you're done for.

So, handle with care. I won't say here be dragons, but the dragons are damned close by. Best sharpen your own swords first.

The path taken by the article's author to reach his conclusion is illustrative in itself, especially if you've a bit of a background in programming. It provides some good examples of How Not To.

Here's the link to the article.

 

pmmg

Myth Weaver
I'll venture further afield. Don't expect AI to make you better than you are. Don't expect it to teach you anything if you are not already adept at learning. [ex-professor aside: learning is itself a craft that must be learned; it's not merely an event that happens]

You cannot use these tools and become a better writer. Once you reach for the easy way, you will leave yourself behind.

If you gaze long enough into an abyss, the abyss gazes back into you.

AI is not for artists. It should be soundly rejected.
 

MSadiq

Scribe
On your point about competence, AI is terrible at understanding context, so it might suggest edits (which is what I'd like to assume people are using it for) that are absolutely unfit for your story, and you'd not know better because you don't have the competence to know. I've used it for translation quite a bit (I used to intern as a translator-proofreader in a quality assurance office at the local uni), and it's not helpful unless you can guide it. For example, it'd either translate idioms literally or translate them into senses that don't fit the context.

Even if you feed it a whole novel, you've not given it the full context. The FULL context isn't in the text; it's in the writer's mind: purpose, drive, target audience, and a whole ton of other info that never makes it onto the page.
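To make the "guide it" part concrete, here's a minimal sketch of the difference between a bare translation request and one where the writer supplies the context the model can't infer. The call_llm() helper is a hypothetical placeholder standing in for whatever chat model or API you actually use; only the two prompts matter.

# Rough sketch of guiding an LLM translation, per the post above.
# call_llm() is a hypothetical placeholder; swap in whatever model or API you use.

def call_llm(prompt: str) -> str:
    # Placeholder: send the prompt to your model of choice and return its reply.
    return "<model reply goes here>"

idiom = "He kicked the bucket."

# Unguided: the model is free to translate word for word,
# which is how a literal bucket ends up in the target language.
bare_prompt = f"Translate into French: {idiom}"

# Guided: spell out the context the model cannot infer on its own,
# such as register, audience, and what the line is doing in the story.
guided_prompt = (
    "Translate the line below into French for a darkly comic fantasy novel.\n"
    "The narrator is a cynical mercenary; keep the tone flippant.\n"
    "Translate idioms by sense, not word for word.\n"
    f"Line: {idiom}"
)

for label, prompt in [("bare", bare_prompt), ("guided", guided_prompt)]:
    print(f"--- {label} ---")
    print(call_llm(prompt))

Even then, you still have to be able to judge whether the guided output actually fits; the extra context only narrows the model's guesses, it doesn't supply competence.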
 
I think it can be a tool in a toolbox, but a tool you have to know how to use. It's not like using a hammer to bash in a nail; you've got to learn the tool through practice. And it's just not up to some tasks. Doing basic code? Yes. Helping write a risk assessment? Yes. Writing a novel? Nooooo
 

Ban

Troglodytic Trouvère
Article Team
Agreed. AI can only copy the fruits of labour; it cannot grant the experience and mastery that doing the labour may instill. If a writer's concern is solely with the product of labour, then AI can create one. A derivative one, empty of any novelty and not reflecting the soul of the "writer" (more so "director") of the work. There is no use for AI whatsoever for anyone who respects the craft of writing. Give it a wide berth and denounce its use, I'd say.

In less lofty terms: if you (proverbially) only care about getting somewhere, take the bus or the train. It works and is effective, but no amount of commuting will teach a person how to drive or cycle ;) Nor does the commute take you anywhere you wish. It lessens your freedom if you are reliant on it, only taking you to its designated places on its designated timeline. The car, the bike and your own mind, steered by you alone, can reach elsewhere.
 

SamazonE

Troubadour
It can only end badly. Tolkien was against industrialisation. We should have stopped at the dictionary and pirates.

Taking it a different direction, technology lies to us. What else is it for other than being a solid buffer to reality?

I am taking precautions using AI, because if it has screwed me in the past, then it has screwed me on TV, and it will do so again.

So far we have a calculator and a foil hat. Soon we have a monster that cannot hurt us at all.

(Anyone notice my Skynet allusion?)
 
I've found a rather interesting use of an LLM on YouTube.
She was created by a very experienced coder to be a V-tuber.
I've only seen a few videos by him (and her, I guess) and they've been somewhat entertaining.
This isn't just an off-the-shelf LLM like ChatGPT or Gemini; it's one custom built and designed to be a V-tuber.
Here's the video that had me snickering the entire time.
Some of the interactions were pretty golden and had me wheezing.
 

Mad Swede

Auror
LLMs are sometimes presented as a threat to writers and other artists. But the truth is that they're no better than the texts (or whatever else) they were trained on. There's no intelligence in LLMs, no ability to infer and interpret. LLMs just recognise and apply patterns based on the material used to "train" them. That makes them limited. Using an LLM is like using any other tool: it requires thought and it requires care. The originality and the quality come from the person using the tool rather than the tool itself. As with any tool, it is the craftsman's competence that determines the quality of what is produced, not the tool itself.

No, I do not use LLMs nor would I ever do so.
 

pmmg

Myth Weaver
Yes, I guess our artistic integrity is safe because it sucks. LLMs may have reached their pinnacle, but AI is just getting started.
 
To start, just to make it clear, I don't use LLMs to write, and I'm not planning to. I'm not saying never, since I can't predict the future and never is a long time. But at the moment I don't see the point.

A couple of thoughts. First, learning how to use a tool isn't the same as learning how to actually do something. I see this in math education in schools a fair bit. A lot of kids are taught how to use a calculator to solve math problems, which doesn't actually teach them any maths; it just teaches them to use a calculator without understanding the actual maths.

The hard thing, of course, is who decides which tools are allowed to be used and which aren't. If you're DIY'ing, you could drill a hole in the wall using a hand drill, or paint your wall with a regular brush. And you'd probably find some old and very talented handyman claiming that's the best way to do it, and he'd show you his results and they'd probably be better. However, does that make drilling a hole by hand a skill worth learning, or is the electric drill a tool worth using for most people?

Where do you draw the line between which tools are and aren't allowed to be used?

These discussions happen with every technological advancement. You'll find the "established" users calling the new technology useless, or a passing fad, or something that makes things worse rather than better. It's how you get IBM saying "I think there is a world market for maybe five computers". It's painters claiming photography isn't a real art (while at the same time crying that it will completely replace painting as an art). Or old Dutch news items with people saying they see no point in a mobile phone because they already have a phone at home, or in a smartphone because who needs internet on the road?

In the end, tools just are. They don't teach you a skill, and they don't turn you into something you're not. But they might make your process easier, more enjoyable, or faster. The quote Skip mentioned above, about how the tools cannot bestow competence upon the incompetent, is an important one. But having read the article, I think there's an equally important quote at the end:

An overwhelming majority of developers say they’re using AI tools in some capacity. It’s a safer career move at this point, almost regardless of one’s field, to be more familiar with them than unfamiliar with them. The genie is not going back into the lamp—it’s too busy granting wishes.

I don’t want y’all to think I feel doomy-gloomy over the genie, either, because I’m right there with everyone else, shouting my wishes at the damn thing. I am a better sysadmin than I was before agentic coding because now I can solve problems myself that I would have previously needed to hand off to someone else.

Do I think LLMs can write novels? No, they're not made for that and it's not playing to their strengths. Do I have ethical issues with where the training material comes from? Definitely. Do I also think there might be useful ways to use AI, and will using them help some writers get ahead? Probably yes. There's no turning back technology, and the cost-benefit analysis is too far in favor of AI for it to disappear.

Just two examples. Editing is one: if an AI model can edit a novel to 80% of the level of a professional editor, then many people will forgo the cost of the professional. Even 50% of the way there is already a great proposition; few people have $1,000 or more to spend on a hobby project that's unlikely to earn more than $500. Same with creating ad images. If you want to create ads and test a few hundred of them, stock photos are expensive and a professional agency is prohibitively expensive, while getting an AI to create them for you is free. It's an easy choice from a monetary perspective.

Yes, I guess our artistic integrity is safe because it sucks. LLMs may have reached their pinnacle, but AI is just getting started.
I just feel the need to point out that this is only partially true. LLMs are indeed reaching the stage of small, iterative improvements; the easy gains have already been made.

"AI is just getting started" is, however, hype sold to investors to boost stock prices. AI research has been going on for 30+ years already, so I don't expect something to magically appear on the grounds that we've only just started working on it. Yes, we're still near the start, but the big strides that seemingly happened overnight had been in the making for 30 years.
 