
Using AI effectively

People like to come up with fancy terms for things, like ‘prompt engineering’, which is basically using more than two brain cells to write a more specific prompt for ChatGPT, but this still - at this point in time - does not produce writing that can actually rival human creativity. When I was looking into submitting a short story to some online platforms, I came across many articles describing the flood of submissions since ChatGPT’s arrival, some saying AI pieces were becoming the majority of what they received, and that what gave them away was the exact same lines and language usage. ChatGPT and other similar programmes are being used exploitatively to make a quick buck, which in turn means that AI-written stuff will become valueless in the not-so-far future. When humans figure they can make money quickly they flock like flies to shit towards the nearest opportunity. If anything, human-made writing is going to become more in demand, and writers are going to get more creative, innovative and competitive.
 

pmmg

Myth Weaver

I strongly suspect something like this will play out. But AI is a moving target. It also gets better over time, and becomes harder to identify.

My concern is not for AI. It is for all of you, and all of us. The way you do one thing is the way you do everything. And if you are reaching for shortcuts now, you will still be reaching for them when AI improves. With each iteration, AI gets better at doing what you think you need it for, and you become less important. When that day arrives, there will be no need for you. Thinking, "I will just get AI to write me 100, or 250, or 2000 words, and I'll just edit it and make it my own," is a path to not having anything to offer.

Already, I can say that if you are using AI in your writing, I think less of you, and have far less respect for anything you might write. I am here to advocate for you. But if all you offer is dressed-up, regurgitated AI, what's the point? I thought you wanted to be a writer.

Anyway...nuther AI thread. Good luck with it.
 

Fyri

Inkling
Man, I tried to get AI to tell me a story when I was sick once. Talk to me when it can write something with so much depth that it makes me cry, or at least feel something that isn't surface level.

Hmm. That is something I haven't seen brought up in these discussions.

Commonly, it has been said that stories are meant to explore or expose or whatever the Human Condition. AI is not human. Should/Can AI be expected to write stories for humans about the human condition? This may be a more useful question when AI is more sentient.
 
Another look at where AI is, and not where it's going.
Well, the thread is about "Using AI effectively", which really does come down to looking at where AI is right now, not where it might be going in 5 or 10 years.

So, I'm not going to make any claims as to its abilities. But, people who advocate for using AI talk about using prompts to generate small chunks at a time, with lengthy and carefully guided prompts. You wouldn't give it a two sentence prompt and ask for 100k words. It's more like you'd give it a 500 word prompt and look at 2,000 words, which you'd have to edit, then adjust the prompt for the next 2,000 words.
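That chunk-by-chunk workflow can be sketched in code. Everything here is hypothetical - `generate` is just a stand-in for whatever model API you'd actually be calling - but it shows the loop of prompting, getting a chunk back, and feeding the tail of the story so far into the next prompt:

```python
# Sketch of the chunked-prompting workflow described above.
# generate() is a placeholder for a real text-generation API call;
# all names here are made up for illustration.

def generate(prompt: str, max_words: int = 2000) -> str:
    """Stand-in for a call to a text-generation model."""
    return " ".join(["word"] * max_words)  # dummy output

def draft_novel(outline: list[str], words_per_chunk: int = 2000) -> str:
    story = ""
    for beat in outline:
        # Each prompt carries the next outline beat plus the tail of the
        # story so far, because the model has no memory of earlier chunks.
        prompt = (f"Continue the story. Recent text: {story[-2000:]}\n"
                  f"Next beat: {beat}")
        chunk = generate(prompt, words_per_chunk)
        # In practice you'd edit each chunk by hand before continuing.
        story += " " + chunk
    return story
```

The point of the loop is visible in the prompt line: only a fixed-size tail of the story fits into each request, which is part of why long AI drafts lose track of themselves.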
I haven't read any complete AI novels, nor do I write using AI. However, what I have seen is that AI actually starts off decently. Not amazing, mind. It's still bland and formulaic. It's also better than some human writing I've seen, though, so there's that.

But as you go along and move past the 2k-word mark (or something like that) for the whole tale, the AI becomes progressively more vague. And that's with multiple prompts - asking it for 500 words, then another 500, and another. It slowly falls apart over time. At first this isn't noticeable, but it happens. In all the videos I've seen on this, it's just glossed over with a "I'm just quickly showing you this". But it's a real phenomenon.

It's also completely understandable, and it has the same origin as the fact that AI images can't be used to train AI image generators (those fall apart within a few iterations as well). AI works based on probabilities. It guesses the next word (and sentence). It's pretty good at that. It can take all of Harry Potter and, from that, guess what J.K. Rowling's next sentence would be. In that sense it can pretend to be J.K. Rowling.

However, we're still talking about probabilities. And there are a lot of words in a novel. As you progress, even with repeated prompts, things become increasingly uncertain. An intro to a novel is easy: there's only one character to track and not much has happened. Executing the half-way crisis is a lot harder, with a dozen characters and four plot-lines that all play some role. With each word, the whole thing becomes more muddy. The output drifts towards random words more and more. Because it's building on itself, it has less and less of a clue what it's supposed to be doing.
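The "guessing the next word" idea above can be illustrated with a toy bigram model - deliberately crude, nothing like a real LLM in scale, but with the same compounding-probability structure: each word is a probabilistic pick given the last one, so small oddities stack up as the text gets longer.

```python
# Toy next-word generator: a bigram model picks each next word from
# the distribution of words that followed the current word in its
# training text. Only an illustration of probability-chained text.
import random
from collections import defaultdict

corpus = ("the boy saw the owl and the owl saw the castle "
          "and the boy walked to the castle").split()

# Count which words followed which in the training text.
following = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    following[a].append(b)

def generate(start: str, length: int, seed: int = 0) -> str:
    random.seed(seed)
    words = [start]
    for _ in range(length - 1):
        options = following.get(words[-1])
        if not options:  # dead end: nothing ever followed this word
            break
        # Each step is a probabilistic guess given only the last word;
        # errors compound, which is why long continuations drift.
        words.append(random.choice(options))
    return " ".join(words)
```

Every word it emits is plausible locally, but there is no plan connecting word 1 to word 50 - which is the small-scale version of the drift described above.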
 
The developers get the software to pass the ‘Turing test’, which basically tests whether its responses can pass for a human's - a sort of ‘sentient’-seeming response, not actual sentience.
 
I think one of the major flaws with text AI is that if you ask it to write a story with political intrigue, for example, it won't actually tell you the specifics of that intrigue. It'll say something like ‘and Lyra Dragonscroft discovered a world of political intrigue’ - at which point you have to write the actual story yourself, which defeats the point.
 