[Image: a human hand and a robot hand pointing at each other, as in Michelangelo's Hand of God on the Sistine Chapel ceiling (black and white)]

ChatGPT Did Not Write This Update

It’s true: ChatGPT (or any other AI writing tool) did not write this update. But I HAVE been playing around with it, and I can see the appeal, at least in part.

But there has been A LOT of conversation in the community about these tools.

I, like many other writers, experience some form of writer’s block from time to time. It’s not usually the “I don’t know what to write about” kind, but more often the “how do I get started with this post” kind. And for that, AI writing tools are amazing. “Give me an outline for…” seems like a good prompt and a great place to start.

Someone asked me if using AI for writing is plagiarism. I asserted that it isn’t, as it’s not taking someone else’s writing found published elsewhere and presenting it as your own. I see it as a tool and even as part of research. But I’m curious about what your thoughts are.

  • Is using an AI writing tool plagiarism?
  • Is it a threat to content writers?
  • Is it a crutch?
  • Or is it simply another tool?

Let me know what you think.

In the Spotlight

Post Status professional member Talisha Lewallen is in the spotlight this week. Talisha is the CEO and Founder of WPConnects, and the Founder and President of the CertifyWP Foundation.

Weekly Member Huddles

We’re restructuring in 2023, and expect to have news about this soon.

Post Status

You and your whole team can join Post Status too!

Build your network. Learn with others. Find your next job — or your next hire. Read the Post Status newsletter. ✉️ Listen to podcasts. 🎙️ Follow @Post_Status 🐦 and LinkedIn. 💼

Comments


  1. Good questions, I’ve been thinking about them.

    Passing off 100% machine-generated text as your own seems like plagiarism, even though there’s no theft of an original work belonging to someone else. In some cases it’s likely to be outright illegal and treated as forgery; in others, a fireable offense that violates an industry’s ethics, much as plagiarism is.

    Interestingly, it might be different for visual art, music, and software. You can make a better claim that your original work in these categories is being used without credit or compensation by a machine learning model that’s fed things that are not in the public domain. But then, isn’t that what human learning does too?

    Being fluent in a tradition, genre, or language requires us to have absorbed a lot of other people’s stuff. What we create we call our own, but we know it is more or less consciously influenced by others. There’s a hazy line between influence that is merely apparent and influence so strong the result is considered derivative to the point of being a ripoff or cheap knockoff.

    I haven’t looked closely at Matthew Butterick’s case against GitHub Copilot, but it seems to me that open-source code is meant to be learned from by others, who then create new code with what they’ve absorbed. Why not have Copilot just help you do this faster at a basic level? How is that theft? It might be cheating if you pass it off as 100% your own work. But even that doesn’t matter in many cases.

    No great literature, music, or visual art has ever been the sole work product of a single mind. Apprentice painters do the background bits for the masters. Books have been the same for much of their history — many hands construct them. Authors work with editors.

    I think AI-generated work is a tool, and all tools can be crutches — or weapons.

    The “threat” question is a tough one. I think “writers” have nothing worse to fear than what they have always had to deal with. “Content writers” are increasingly victims of “content” (much of it theirs) becoming a synonym for “crap.” People who want to churn out anonymous, mediocre marketing material that isn’t supposed to be original or conversational have never wanted to pay much for it — or respect it as a vocation.

    So I think it’s likely and a good thing for machines to be the only “content writers” who are guided, reviewed, and enhanced by human writers, editors, and field experts. I hope this leads to better and more restricted use of this kind of content — and an expanded interest in the value of unique human minds and voices — fluent in writing and speaking human languages.

  2. I’m interested in the potential outcome of this case, which could have ramifications for attribution when producing AI content.

    The point regarding plagiarism is really interesting. In school and university, we were taught to reference our sources, and I think that the biggest danger with AI – both in code and text – is the lack of attribution as to where the output is derived from.

    The other interesting aspect is the type of information that is being fed into the models, and whether any type of copyright decisions will impact on the quality of the output.

    For instance, suppose a decision is made that using copyrighted content requires the explicit consent of the author, and no Booker Prize-nominated author provides it, so the models are trained on free novels found on Amazon. Would the output then only ever reach a certain level of quality? Would this have an impact?

    Whilst I am sure there will be improvements, AI content is fairly easy to spot right now: it is generic, bland, and lacking the voice or style that a human author can bring. Without high-quality content going into the top of the funnel, is it possible to reach the stage where the content has the secret sauce that comes from a real person, with real-world experience and a point of view that only they can bring?
