The only thing that's going to matter is going to be the objective quality of a piece of writing. I think in 10 to 15 years that is the only thing that'll matter. Just as when I walk into a club or some bar and I hear the song, I don't care how it was made. I don't care who made it. All I care about is that the song is a vibe. I think it's going to be the same thing with writing. I don't care if AI wrote it for you. I don't care if AI wrote it with you. I don't care if you wrote it by yourself.
— David Perell
What are the unwritten rules of writing with AI?
✅ Assisted by AI? Ok sure, we all do this.
❌ Written by AI? Unacceptable. No different from plagiarism.
That’s how I see things right now.
(copywriting is a different conversation)
The problem is these rules are not clear.
Being assisted by AI isn’t cheating. But where do you draw the line on assistance? What’s the difference between incorporating written feedback from a human being and from an LLM?
Using AI to learn about a topic is a reasonable thing to do.
Large language models are remarkably efficient interfaces for conducting research. I see nothing wrong with this. I draw the line when it comes to generating significant output that I claim as my own.
Ok, but wait.
What is considered significant?
Is it ok if I have AI rewrite something I wrote?
So many questions and thoughts to explore here.
Having AI write 80% of an essay for you, without disclosing it, would probably be a reputation killer if anyone found out.
Having AI write anything “creative” for you still carries a stigma. It’s uncommon for anyone to admit it, even if it’s painfully obvious.
Curious to hear your thoughts.
— Daniel