This post will be updated as K-12 schools and universities adjust.
A few schools, like Georgia Tech, are waving a greenish flag of support for AI use on essays. I love how this blog breaks it down: fantastic advice and guidelines.
The questions I would ask anyone who’s considering using AI on their college essay:
- What does the school you’re applying to say? That’s the first place to check. Don’t use AI if the school says not to.
- Where is the content from? All the content should be about you, not borrowed from the Internet.
- Are you using AI the way someone would use a tutor? Are you asking for feedback, not replacement words? Are you asking so you can learn how to be a better writer, or are you asking for AI to rewrite your essay?
- Are you comfortable giving your data as training material to various tech companies, whose fates are ever changing? Check with your family before you choose and use a tool.
I’m working on a series of AI-ready questions for a future season, if and when schools allow AI usage.
I have to say: it can’t write a very good college essay yet.
Yes. Using ChatGPT is plagiarism. Success Story essay consulting does not endorse your use of ChatGPT at this time to write your college essay.
I coach students to write to an audience: college admission readers. The institutions you are querying are not communicating a policy at the time I’m writing this post, and there is no way for me to determine each individual institution’s policy today, in the middle of the 2022-2023 application season. Nor is there any way for me to advise you on whether it’s wise to use a tool on steroids, one that goes way beyond Grammarly or a human coach. Once I hear what institutions are choosing to do, I will update this policy as needed.
Right now, ChatGPT does thinking for you, and just like any adult, peer, or coach doing thinking for you, that is academically dishonest.
Just because you can, that don’t mean you should. – Greg Hawks, “Just Because You Can”
And just because ChatGPT exists doesn’t mean we need to upend everything we’re doing today.
Wait, What’s Plagiarism Again?
It’s not only copying someone else’s words (including a bot’s) without quoting and citing; it’s also letting someone else do your thinking for you without giving credit.
What’s the difference between good coaching and editing and bad coaching? Good: making sure the student’s ideas and voice are always front and center and that the student follows a process led by great questions. Bad: putting your own ideas and voice into the student’s work and doing the thinking for the student.
- ChatGPT does the thinking for you, robbing you of a chance to do self-reflection, to analyze and evaluate, and to wrestle with finding the right words in the right order. In seconds, a whole first draft is generated.
- I don’t use “robbing” lightly. Because right now, with no guidelines (stop signs, traffic lights), I as a coach and educator can’t advise you on how to make sure, with this tool, you still engage in critical and creative thinking. Which is what a college essay is. (Fragment for emphasis and voice. ChatGPT right now, by the way, ain’t a fan of fragments.)
- And see my last note on what else you might be robbed of if you take shortcuts now.
- The Association for Writing Across the Curriculum says it well in their “Statement on Artificial Intelligence Writing Tools in Writing Across the Curriculum Settings”: “As scholars in the discipline of writing studies more fully explore the practical and ethical implications of AI language generators in classroom and other settings, we underscore this: Writing to learn is an intellectual activity that is crucial to the cognitive and social development of learners and writers. This vital activity cannot be replaced by AI language generators.”
- Some schools don’t have policies about whether or not you can use it for your application. But it’s safe to assume that just as you’d need to state for the record that your essay coach, parent, friend, sibling, or mercenary essay salesperson wrote your essay for you, you’d need to state that ChatGPT did.
- And sharing that you did so is not a good plan right now, unless you can prove that you worked with the tool in some super-creative, groundbreaking way that might impress one of the schools you’re applying to. I’m not even sure what that would look like at this stage. And I’d say it’s a rare student who can thread that needle.
- Schools, whether colleges or K-12, haven’t had time to assess how their curricula can evolve beyond anything AI can do right now. Schools have barely drawn a breath in the face of the tool to “teach to the tool,” if that makes sense. And I do think we should learn how to partner with it, and as nimbly as we can. But until then? For the near future, you will still be expected to write essays in college, to write cover letters for job applications, and generally to know how to use words and critical thinking to express yourself. So the college application essay process is an important dress rehearsal for that.
- For more on this topic, read my post, “How Much Should Essay Tutors Edit?”
From Whence Comes the Temptation
In his book Plagiarism: Why It Happens and How to Prevent It, author Barry Gilmore notes a variety of reasons students might plagiarize, ranging from “This will make my life soooo much easier,” to “I didn’t really know that’s plagiarism,” to time or economic pressures, to cultural expectations, plus other factors.
I get that the pressures are indeed there. They are also not a justification or excuse for submitting heavily assisted work without acknowledging that assistance.
What ChatGPT does right now is exactly like me writing your first draft and handing it to you to edit. Then returning it to you, again and again, with better and better drafts. Within seconds.
But Can’t I At Least Partner with It?
I think partnering with it in the future could be very cool. I just wouldn’t experiment or innovate with the tool on your college application this winter or spring.
With enough coaching and the right prompting, you can work with it until you get it to be more you and less ChatGPT. That editing of the prompts, or coaching of the AI, is a critical thinking exercise for sure, and I’m not in the least against that evolution of our learning methods. But the way your current college applications are designed, they assume you are submitting your own work. This particular hoop you’re jumping through was designed for a world pre-November 2022.
I’ll be posting about how we might partner with this tool in the future. Just not today.
[If your brain’s already ready for that path, Glen Kleiman of Stanford already has some very cool insights and a framework (thank you!) to help me think about this. Check out “Teaching Students How to Write with AI: the SPACE Framework.”]
I also like the guidance of authors out there trying to navigate this brave and weird new world.
What are examples of using AI unethically as an author?
Cutting and pasting generated text, willy-nilly, without checking and adapting the output, then passing off these words as your own.
AI tools can be used for idea, character and story generation as well as text generation. It is your job as an ethical author to edit and curate the words generated by a tool you’ve prompted, and to ensure the text is not, for example, derogatory or offensive. (From Lyn: this is what I’m thinking the future could be for college apps or other writing experiences in K-12 and elsewhere. Just not today.)
If you do use text generated by an AI tool, as well as editing, run the final work through a plagiarism checker to ensure you have not unwittingly infringed someone’s copyright. (Again, someday.) – Alliance of Independent Authors
Wait–Adults Get to Play With It and Yet I Don’t…?
I hope you do play with the tool sometime. Just not on a high-stakes venture such as your applications.
And trust me, when I use two metaphors, the Wild West and Back to the Future, I do so because the ethical questions raised by reality and fiction (whose land, whose work, who owns what, whose choices, whose justice, whose kudzu is being unleashed on native plants, all of that) are very much in play here. My post on What ChatGPT Means for Us Non-Humans says more about the stakes.
I believe someday, maybe as soon as within the next year, the definition of academic dishonesty will change to allow the use of AI in certain ways and contexts. As Cynthia Alby discusses in her forward-looking article, there’s a lot of potential here to partner with machines. And some colleges may be doing it now. But I can tell you, admissions offices don’t have time to pivot today.
Let’s say, for the sake of argument, that basic writing will now be generated by AI initially, tweaked with a few additional prompts, and then polished by the human writer. Or for more creative, personal, analytic, or cutting-edge writing, the work will start with the human writer and be polished in the end by AI. We’d all continue to do a lot of “writing to learn”—writing to help us think, brainstorm, and make sense of our thoughts and feelings, with no desire to use AI. Would all this be so terrible? Already when I write I use loads of “assistants.” Word checks my spelling and grammar, and I use its thesaurus to find the ideal word. Google helps me fact-check. A friend might make suggestions. Citation generators help with APA style. If I am writing something that follows conventions I am less familiar with, such as a book proposal or grant, I look at others’ examples. Is any of this “cheating?” How might what constitutes “cheating” change in the age of AI? – Cynthia Alby, “ChatGPT: A Must-See Before the Semester Begins.”