I recently read a pretty good article in Ars Technica that relates one person's use of AI to write a bit of code. Those interested can read the whole article (link below), but I found his observations relevant to what we do around here. I'll quote the key paragraph, then add a few comments.
"These tools as they exist today can help you if you already have competence. They cannot give you that competence. At best, they can give you a dangerous illusion of mastery; at worst, well, who even knows? Lost data, leaked PII, wasted time, possible legal exposure if the project is big enough--the 'worst' list goes on and on."
These comments feel both relevant and important. Transposing a bit: if you already write well, at the level of competence, then the tools can be useful. Note that he does not say a word about the tools being morally admirable. Just useful. The tools cannot, however, bestow competence upon the incompetent. Or, I shall add, upon the lazy, the over-eager, the woefully naive, or the unscrupulous.
I'll venture further afield. Don't expect AI to make you better than you are. Don't expect it to teach you anything if you are not already adept at learning. [ex-professor aside: learning is itself a craft that must be learned; it's not merely an event that happens]
And, at least for the time being, if you aren't up to the job going into the job, there's a fair chance the job is going to be botched. Perhaps worse, the AI will be good enough to fool you into thinking you're done, when in fact you're done for.
So, handle with care. I won't say here be dragons, but the dragons are damned close by. Best sharpen your own swords first.
The path taken by the article's author to reach his conclusion is illustrative in itself, especially if you've a bit of a background in programming. It provides some good examples of How Not To.
Here's the link to the article.
So yeah, I vibe-coded a log colorizer—and I feel good about it
Some semi-unhinged musings on where LLMs fit into my life—and how I'll keep using them.
arstechnica.com