Demesnedenoir
Myth Weaver
There's been a lot of chatter about AI, and the SFWA sent out requests for input recently; then a friend sent me this. Real-world, or rather, real sci-fi-world data. Much like piracy, this will be a long fight.
To be fair, the same can be said for a large portion of indie covers. Usually it's a stock image of a person in front of a building. Depending on genre, swap in a magical aura, manor house, monster, romance clues, or whatever. Alternatively, go with a weapon, a magical symbol, or random bits of a plant or other item placed over a background with twirly bits in it (again using stock photos). Repetitive and close to trend is what plenty of covers are all about.

Apart from that, I found some of the images repetitious: variants of a monster and a little girl posed next to a castle or manor in a dark landscape.
That is pretty much what my cover artist does: I send her a rough MS Paint sketch and an 'idea' pic or three, and she tracks down and stitches together assorted stock images.

If I were a cover designer, and all I did was stick a few stock images together (yes, I know good cover designers do a lot more...), then I would be worried. And I think you would have every reason to be worried. For everyone else, I think it will be a tool that helps your productivity, which is almost always the case with technology.
We have this.

What we need is an AI to detect if an AI wrote it.
Perhaps if the AI has memory, we could just say... "Hey, is this one of yours?"
The question is slowly changing, though: at what point does it still count as cheating if it has become common practice? After all, if you fast-forward 10 years, it's likely to be common business practice to have an AI gather parts of your data and order it in a logical fashion, and then you as a human edit and rewrite it to get rid of the AI oddities that likely remain.

We have this.
GPT-2 Output Detector
openai-openai-detector.hf.space
It works pretty well until the smarter students use ChatGPT as a draft and then edit it themselves. I can't confront a student when the detector says there's a 50% chance of it being AI.
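The confidence problem above can be sketched as a simple decision rule. This is purely illustrative: the function name, thresholds, and labels are assumptions for the sketch, not part of any real detector. The point is that a detector score only supports a confrontation when it clears a high bar, and hybrid human-edited drafts tend to land in the uncertain middle.

```python
# Illustrative only: a toy policy for acting on an AI-text detector's score.
# Thresholds and labels are assumptions for this sketch, not real policy.

def triage(ai_probability: float, flag_at: float = 0.9, clear_at: float = 0.1) -> str:
    """Map a detector's 'probability of AI authorship' to a teacher's action."""
    if ai_probability >= flag_at:
        return "investigate"    # strong signal: worth a conversation
    if ai_probability <= clear_at:
        return "no action"      # strong signal the text is human-written
    return "inconclusive"       # the 50% gray zone: no fair accusation possible

print(triage(0.97))  # clearly machine-like text
print(triage(0.50))  # edited hybrid draft: the detector can't decide
```

The wide "inconclusive" band is the practical problem the post describes: editing a ChatGPT draft pushes scores toward the middle, where no policy can fairly act on them.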
I agree with you that people should learn how to do something without technology first, before using technology to do the same thing faster. If you know how to do something, you become independent of the technology, which makes it easier to adapt to changes. Take maths as an example, which is both more my expertise and has had this issue for 30+ years already, with ever more powerful calculators available to people. If you teach someone how to solve an equation using pen and paper, they'll know what they are doing, and you can then easily explain how to do it using technology. However, if you hand someone a calculator and only tell them how to solve equations using that calculator, then you haven't taught them maths; you've taught them which buttons to push. And the moment the buttons change, they're lost.

Yet the very act of writing a logical argument in a logical structure forces critical thinking. If AI supplies the structure and logic, my students will be brain-dead. "Writing is thinking" and "writing across the curriculum" have been core tenets of an excellent education for a reason. While I agree that we can't put the genie back in the bottle, the younger kids are, the less I think they should touch this stuff.
I disagree. At least, if you mention the AI as a source. As with anything, taking the work of someone else and pretending it's yours is cheating. However, if it's simply part of your workflow and you're open about it, then it's not. It will end up being an extra step in how papers are written, a tool which assists you, just like a calculator is for math problems.

But to answer the question: yes, it would still be cheating. At the core of cheating is pretending to be something you are not, or pretending to have something you don't, or pretending you've done something you haven't.