
AI Trends

Demesnedenoir

Myth Weaver
Apparently, one AI sniffer is also selling advice on how to beat AI sniffers... Of course, the answer is free: write it yourself. But some folks can't abide working.
 

ThinkerX

Myth Weaver
One of the FB groups I'm on puts up fantastical (and sometimes racy) pictures. Over the past few weeks they have included a good couple dozen AI-generated images. Two things that stood out with these pics (and received a fair bit of comment) were the eyes (almost always kind of dead looking) and the fingers (a lot of the characters had something like eight fingers to a hand). Apart from that, I found some of the images repetitious - variants of a monster and a little girl posed next to a castle or manor in a dark landscape.

The posted excerpt reminds me of those images.
 

pmmg

Myth Weaver
Thinking on this. When I was recently in school, there was a system (I think it was called Turnitin) that would check your paper against a database of other turned-in papers and score it based on originality. Meaning, if too many of your phrases exactly matched those of others, you got flagged. I suspect that would also catch AI work.
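
The core idea isn't complicated. Here's a rough sketch of how that kind of phrase matching could work - to be clear, this is only my guess at the general idea, not how Turnitin actually does it, and the phrase length and the 20% cutoff are numbers I made up for illustration:

# Toy sketch of checking originality by exact phrase overlap.
# Not Turnitin's real algorithm; the phrase length (6 words) and
# the 20% cutoff below are invented for this example.

def phrases(text, n=6):
    """Return every run of n consecutive words in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(submission, previous_papers, n=6):
    """Fraction of the submission's n-word phrases that appear verbatim
    in any previously submitted paper."""
    sub = phrases(submission, n)
    if not sub:
        return 0.0
    seen = set()
    for paper in previous_papers:
        seen |= phrases(paper, n)
    return len(sub & seen) / len(sub)

previous_papers = [
    "the dark lord rose again in the east and the kingdoms of men trembled",
    "she drew her sword and stepped into the shadow of the ruined tower",
]
submission = "the dark lord rose again in the east and nobody seemed to care"

score = overlap_score(submission, previous_papers)
if score > 0.20:  # arbitrary cutoff for this example
    print(f"Flagged: {score:.0%} of phrases match earlier papers")
else:
    print(f"Looks original enough ({score:.0%} overlap)")

A real checker would obviously do far more (fuzzy matching, handling quotes and citations, a much larger corpus), but the basic mechanic is just comparing phrases against what has already been handed in.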
 

Demesnedenoir

Myth Weaver
Yeah, AI has a ways to go. If you look at AI artwork too closely you start saying... muuuuhhhhh. And repetition will be king with AI; it's not creative, it's just slapping together human work. Architecture, body parts... dear God! Asking for a horse is not a good idea, heh heh. And then, on the other hand, it will do some fantastic artwork that drops your jaw, but those tend to be portraits (no hands!) and landscapes.

A friend of mine who uses AI at work (ChatGPT found obscure coding info for him after months of trying now and again to find it) thinks that, at least for now, AI would be most useful in finding out what is trending, popular, or even overdone. It might even be able to point toward future trends... but AI relies on what has been done, even if it might splat some old elements together to make it look new.
 
Apart from that, I found some of the images repetitious - variants of a monster and a little girl posed next to a castle or manor in a dark landscape.
To be fair, the same can be said for a large portion of indie covers. Usually a stock image of a person in front of a building. Depending on genre, swap in a magical aura, manor house, monster, romance clues or whatever. Alternatively, go with a weapon, magical symbol, or random bits of a plant or other item placed over a background with twirly bits in it (again using stock photos). Repetitive and close to trend is what plenty of covers are all about.

If I was a cover designer, and all I did was stick a few stock images together (yes, I know good cover designers do a lot more...), then I would be worried. And I think you would have every reason to be worried. For everyone else, I think it will be a tool which helps your productivity, which is almost always the case with technology.

And the comment about the coder friend is spot on. We as a community are focusing a lot on writing novels. But that's not where the money is for AI developers. We as authors and publishers already can't make any money selling novels. There's little reason to write a program that has the same problem. There are a lot of other areas that are easier to make money in, and easier to create an AI for. Coding is a good example, but I'm sure there are hundreds.
 

ThinkerX

Myth Weaver
If I was a cover designer, and all I did was stick a few stock images together (yes, I know good cover designers do a lot more...), then I would be worried. And I think you would have every reason to be worried. For everyone else, I think it will be a tool which helps your productivity, which is almost always the case with technology.
That is pretty much what my cover artist does - I send her a rough MS Paint sketch and an 'idea' pic or three, and she tracks down and stitches together assorted stock images.
 

Demesnedenoir

Myth Weaver
Yeah, covers get repetitious for that reason on top of "what works, works." Also, so many aren't even creatively mushed around, they're a stock photo barely touched.

And absolutely true on AI making money in other industries. However, books are sneaky profitable when quantity can swing profit margin so hard. With any luck, it will be more of a productivity feature than a "write a book" one.
 

Stepgingerly

Dreamer
What we need is an AI to detect if an AI wrote it.

Perhaps if the AI has memory, we could just say...Hey, is this one of yours?
We have this.

It works pretty well until the smarter students use ChatGPT as a draft and then edit themselves in. I can't confront a student when the detector says there's a 50% chance of it being AI.
 

pmmg

Myth Weaver
It's okay. It just means their road to having anything meaningful to contribute will be harder. It's still true: when you cheat, you only cheat yourself.
 
We have this.

It works pretty well until the smarter students use ChatGPT as a draft and then edit themselves in. I can't confront a student when the detector says there's a 50% chance of it being AI.
The question is slowly changing though. At what point does it still count as cheating if it has become common practice? After all, if you fast-forward 10 years, it's likely that it will be common business practice to get an AI to gather parts of your data and order it in a logical fashion, and then you as a human edit and rewrite it to get rid of the AI oddities that likely remain.

Assuming universities are meant to prepare you for the "real" world, then would it still be cheating if you work in the same manner?

I think the boundaries will shift. As with all of these technology trends, humans will have to focus on the part where they can add value. Fifty years ago, that was still simple data processing; a hundred years ago, it was even doing complex calculations by hand. It shifts, it always does. And there is little point in fighting it, since someone else won't fight it and will thus gain a competitive advantage. I don't doubt that AI search will become as common as a Google search, and then students will simply be told to list the AIs they've used for their papers.

It will mean that the testing process will need to change. And that is the harder part, because it means that professionals who have worked and been tested in the same way for a few decades will need to change. The ways in which this will happen are hard to predict with certainty, and won't be the same everywhere. But a few possibilities: the focus could shift from simply producing papers to hand in to actually being original in them, to how convincing your arguments are and how readable you can make an AI text. Or it could go back to basics. You don't need to write a 20-page article; a few pages is enough, they just need to be very thoroughly researched. And you'll likely have to show real understanding of a subject, not just the ability to generate text, which will mean asking the student questions about what they handed in instead of simply grading a paper.

The same goes for writing a novel, by the way. I personally expect the writing-a-whole-novel part to be the hardest to achieve. But copy-editing should be easy enough. And your actual skill at telling a story and coming up with original ideas and believable characters will become more important. It won't be about generating as many words as you can, but rather making sure the story is sound.
 

Stepgingerly

Dreamer
Yet the very act of writing a logical argument in a logical structure forces critical thinking. If AI gives the structure and logic, my students will be brain-dead. "Writing is thinking" and "writing across the curriculum" have been core tenets of an excellent education for a reason. While I agree that we can't put the genie back in the bottle, the younger kids are, the less I think they should touch this stuff.

I'm more interested in the use of AI in the future for marketing. Reading some of the threads about paid newsletters, reviews, and buying ad bundles, I wonder if there's a legitimate space here for an AI-type program to link openly with social media accounts to promote books and post to numerous sites at once. Could a program like that be built to optimize exposure for new novels to the best audience by genre? I'd pay for that in a heartbeat rather than flounder around, hoping to get lucky.
 

skip.knox

toujours gai, archie
Moderator
>Assuming universities are meant to prepare you for the "real" world
This is one of the more unfortunate developments in higher ed. Universities should be preparing us to be multi-dimensional human beings; otherwise, it's not education, it's merely training. I have a whole rant about that.

But to answer the question: yes, it would still be cheating. At the core of cheating is pretending to be something you are not, or pretending to have something you don't, or pretending you've done something you haven't. In short, cheating is a variety of pretense. That it helps you sell the product or get the job doesn't make it any the less pretense (or pretentious).
 
Yet the very act of writing a logical argument in a logical structure forces critical thinking. If AI gives the structure and logic, my students will be brain-dead. "Writing is thinking" and "writing across the curriculum" have been core tenets of an excellent education for a reason. While I agree that we can't put the genie back in the bottle, the younger kids are, the less I think they should touch this stuff.
I agree with you that people should learn how to do something without technology first, before using technology to do the same thing faster. If you know how to do something, you become independent of the technology, which makes it easier to adapt to changes. Take maths as an example, which is both more my area of expertise and has had this issue for 30+ years already with ever more powerful calculators available to people. If you teach someone how to solve an equation using pen and paper, then they'll know what they are doing, and you can then easily explain how to do it using technology. However, if you hand someone a calculator and only tell them how to solve equations using that calculator, then you haven't taught them maths, you've taught them which buttons to push. And the moment the buttons change, they're lost.

Which is why I said that the manner of testing students needs to change. Force students to show a real understanding of a subject, not just hand in a pile of words where the bigger the pile, the higher the grade. You can fake writing a paper using AI. You can't fake understanding if you're asked a direct question. It's why all my physics exams were done with pen and paper, where you could easily have half a page of scribbles to solve an equation. As a physicist you'd be solving more complex equations using a computer, but showing the ability to solve the easy ones with pen and paper showed understanding of the subject matter.

But to answer the question: yes, it would still be cheating. At the core of cheating is pretending to be something you are not, or pretending to have something you don't, or pretending you've done something you haven't.
I disagree - at least, if you mention the AI as a source. As with anything, taking the work of someone else and pretending it's yours is cheating. However, if it's simply part of your workflow and you're open about it, then it's not. It will end up being an extra step in how papers are written, a tool which assists you, just like a calculator is for maths.

As for the discussion in general, I can't help but wonder if we'll see the same curve as with self-driving cars. Ten years ago, everyone was shouting that by today we'd all have self-driving cars and driving yourself would be well on its way to becoming outlawed. Over the past 10 years, the reality of the situation and the limitations of the technology have caught up with the hype. And slowly we're starting to hear people wonder if we'll ever really get there with the current technological capabilities. We still aren't anywhere near full self-driving.

So it might very well be with AI and writing. The current technology is very impressive. And there are lots of places where it will be useful and powerful. It will change how we work. However, it's still a very big leap from where we are today to "indistinguishable from a human, award-winning novel". And I doubt we'll get there with the current technology, simply because of its nature.
 
It is quite terrifying in one sense, but then it just doesn't have that human touch. When you read a passage that AI has come up with, it reads as quite convoluted and derivative, and you can tell it's not human-made. Perhaps it would be good for writers to use for prompts, but beyond that I'm sceptical, at the moment. If technology becomes sentient then I'll eat my hat.
 
I imagine that's the right decision, based on the fact that AI is just an algorithm of predetermined word combinations. There's no originality, and that's its major flaw.
 

pmmg

Myth Weaver
This article on the ruling does not cover this, but I had read (though I don't recall where) that AI companies producing art were a risk for those who use them, because site owners were able to claim ownership of material posted to their site. So, if you create a book cover, for instance, the site owner may be able to claim that as their art and not yours, and that the material you uploaded to create it is also theirs by virtue of the terms of service.

All of which equates to, I would be wary before doing it.

This ruling says the AI art does not belong to the content creator. It does not assign who does own it. If it's decided that it belongs to the AI host, that could be a mess in the making.
 